Improving the Quality of Preschool Education in a developing economy
Last registered on June 04, 2019

Trial Information
General Information
Title
Improving the Quality of Preschool Education in a developing economy
RCT ID
AEARCTR-0002340
Initial registration date
May 29, 2019
Last updated
June 04, 2019 12:17 PM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
PUC-Chile
Other Primary Investigator(s)
PI Affiliation
IADB
Additional Trial Information
Status
Ongoing
Start date
2016-09-01
End date
2021-06-30
Secondary IDs
Abstract
This trial evaluates a policy package applied to early childhood centers in Panama. The package that all treatment centers received included teacher training sessions, educational materials, a monthly money transfer for the center, close monitoring and support from the central authority, and important improvements in infrastructure.

For evaluation purposes, we randomized the order in which 89 government-administered child care centers would receive the intervention. This allows us to identify the impacts of the intervention on the centers. We are especially interested in how the policy affected the quality of classroom interactions, given that the literature shows this outcome is difficult to improve. Other potential outcomes of interest are attendance levels at the centers and measures of structural quality.
External Link(s)
Registration Citation
Citation
Hojman, Andrés and Sebastián Martínez. 2019. "Improving the Quality of Preschool Education in a developing economy." AEA RCT Registry. June 04. https://doi.org/10.1257/rct.2340-1.0.
Former Citation
Hojman, Andrés and Sebastián Martínez. 2019. "Improving the Quality of Preschool Education in a developing economy." AEA RCT Registry. June 04. http://www.socialscienceregistry.org/trials/2340/history/47510.
Sponsors & Partners

There are documents in this trial unavailable to the public.
Experimental Details
Interventions
Intervention(s)
Our partner, the Ministerio de Desarrollo Social (MIDES), has been implementing improvements to the centers since 2017. All treatment centers received teacher training sessions. There have been several sessions, covering topics ranging from general management and quality standards to teacher-child interactions. The centers also received a monthly money transfer, given in addition to their usual revenues (parents' payments, a per-child food payment, basic educational materials, and minor repairs). The transfers could only be spent on educational materials or other education-related expenditures.

The centers also received close monitoring and support from the central authority in implementing 10 quality standards. The package also included important improvements in infrastructure.
Intervention Start Date
2016-09-01
Intervention End Date
2020-12-31
Primary Outcomes
Primary Outcomes (end points)
1. Dimensions covered by our quality questionnaire (ITERS):
- Interaction (primary outcome)
2. The main score in the Caregiver Interaction Scale (CIS)
3. Attendance at the center
Primary Outcomes (explanation)
For comparability with previous literature, we will use the original dimensions of the ITERS instrument as they are reported by default.
Secondary Outcomes
Secondary Outcomes (end points)
1. Other dimensions covered by our quality questionnaire (ITERS):
- Space and Furnishings
- Personal Care Routines
- Listening and Talking
- Activities
- Program Structure
- Parents and Staff
Secondary Outcomes (explanation)
Attendance is not trivial to construct, because some treatment centers mechanically increased their capacity after the infrastructure improvements, and we need to take that effect into account. We plan to measure attendance in three ways:
- The number of children attending divided by the capacity of the center.
- The number of children who kept attending the center regularly throughout the year divided by the number of children originally enrolled.
- We will analyze how attendance at the centers changed since randomization, to see whether more children attend the treatment centers after the intervention starts, even controlling for the mechanical change in capacity. For example, we will study whether randomization predicts attendance in a panel of centers, even after controlling for center capacity (which itself depends on the randomization).
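The first two measures are simple ratios; the sketch below illustrates them with made-up numbers. Function names and values are hypothetical, since the actual construction will depend on the MIDES administrative records.

```python
# Hypothetical sketch of the first two attendance measures.
# All names and numbers are illustrative, not from the trial data.

def utilization(attending, capacity):
    """Measure 1: children attending divided by center capacity."""
    return attending / capacity

def retention(still_attending, originally_enrolled):
    """Measure 2: children still attending regularly at year's end
    divided by the number originally enrolled."""
    return still_attending / originally_enrolled

# Example center: capacity 40 after expansion, 35 attending,
# 30 of 38 original enrollees still attending regularly.
print(utilization(35, 40))  # 0.875
print(retention(30, 38))
```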
Experimental Design
Experimental Design
Our original potential sample included all child care centers managed by the government. However, MIDES decided that a few centers (originally 5) should not be part of the evaluation, because there were previous commitments to improve them. Thus, we ended up with a list of 89 centers to randomize.

Given that we also wanted to run a child-level evaluation in three arms (see the registered trial "The impact of attending a higher quality preschool in a developing economy"), we divided the sample of centers into two groups: the "couples" and the "independent". The couples were pairs of centers that were geographically close to each other. The independent ones were all the rest. We wanted one center from each couple to be part of the treatment group and the other to be part of the control group.

We used a stratified randomization schedule, with "being in a couple" and urban/rural status defining the strata. For the couples, we randomized which member was to be part of the treatment group. Then, we randomized the rollout priority of all the treated centers, giving their partners the inverse priority (so, for example, the control center paired with the first center to be treated received the last priority). We allocated all of the independent centers to be treated after the treated members of the couples and before the control members of the couples, randomizing their priority within that range.
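The priority ordering described above can be sketched as follows. This is a hypothetical reconstruction for illustration only: center names are invented, and the urban/rural stratification is omitted for brevity.

```python
import random

def randomize_priorities(couples, independents, seed=0):
    """Sketch of the rollout randomization described in the design.
    couples: list of (center_a, center_b) pairs that are geographically close.
    independents: list of the remaining centers.
    Returns (treated, priority): the initial treatment group and the order
    in which all centers receive the intervention."""
    rng = random.Random(seed)

    # For each couple, randomly pick the treated member; keep pairs together.
    pairs = []
    for a, b in couples:
        t, c = (a, b) if rng.random() < 0.5 else (b, a)
        pairs.append((t, c))

    # Randomize the rollout priority of the treated couple members.
    rng.shuffle(pairs)
    treated_first = [t for t, _ in pairs]

    # Control partners get the inverse priority: the partner of the
    # first treated center is rolled out last.
    controls_last = [c for _, c in reversed(pairs)]

    # Independent centers go after the treated couple members and before
    # the control couple members, in random order within that range.
    middle = list(independents)
    rng.shuffle(middle)

    priority = treated_first + middle + controls_last
    return set(treated_first), priority
```

With two couples (A, B) and (C, D) and independents E and F, the schedule always places the two treated couple members first, the independents in the middle, and the control partners last, in reverse order of their treated partners.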
Experimental Design Details
Not available
Randomization Method
Public Lottery (in the presence of all of the child care center supervisors)
Randomization Unit
Child care centers
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
89 centers
Sample size: planned number of observations
89 centers
Sample size (or number of clusters) by treatment arms
44 treated centers, 45 control centers
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Assuming power of 0.8 and a significance level of 0.05, the minimum detectable effect (MDE) in a one-sided test is 0.57 standard deviations. This includes a correction for non-compliance, given that two of the control units received some treatment. It also assumes we will only obtain data on 86 centers.
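As a rough cross-check, a standard two-sample MDE formula with a one-sided test, an even split of the 86 centers, and an inflation factor for the two control crossovers gives a value close to the registered 0.57 SD. This is only a sketch; the registered figure may come from a different formula or software.

```python
from statistics import NormalDist

# Back-of-the-envelope MDE check. The even 43/43 split and the form of
# the non-compliance correction are assumptions, not from the registry.

def mde(n_treat, n_control, alpha=0.05, power=0.80, crossovers=0):
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha)   # one-sided critical value
    z_b = z.inv_cdf(power)
    raw = (z_a + z_b) * (1 / n_treat + 1 / n_control) ** 0.5
    # Non-compliance dilutes the ITT effect, inflating the MDE
    # by 1 / (1 - crossover share).
    share = crossovers / n_control
    return raw / (1 - share)

print(round(mde(43, 43, crossovers=2), 2))  # ≈ 0.56, close to 0.57
```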
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Instituto Biomédico Gorgas
IRB Approval Date
2018-09-28
IRB Approval Number
N/A