Improving the quality of education in developing countries: An experimental evaluation of teacher training programs in El Salvador

Last registered on September 27, 2022

Pre-Trial

Trial Information

General Information

Title
Improving the quality of education in developing countries: An experimental evaluation of teacher training programs in El Salvador
RCT ID
AEARCTR-0010035
Initial registration date
September 21, 2022


First published
September 27, 2022, 11:03 AM EDT


Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
University of Bern

Other Primary Investigator(s)

PI Affiliation
University of Bern
PI Affiliation
University of Bern
PI Affiliation
University of Bern
PI Affiliation
University of Bern

Additional Trial Information

Status
Ongoing
Start date
2022-04-04
End date
2023-11-10
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Quality education is one of the Sustainable Development Goals advocated by the United Nations, but many developing countries are still far from reaching this target. In recent decades, low- and middle-income countries have made impressive progress in raising school enrollment. Yet, their productivity in converting educational investments into human capital remains low, as international student assessments highlight. In response to these findings, the World Bank dedicated its World Development Report 2018 to what was declared a global “learning crisis”. Recent data from Africa, Asia, and Latin America show that poorly qualified teachers – both in terms of pedagogical knowledge and content knowledge – are a key barrier to more effective schooling systems. The available evidence even suggests that the learning crisis in developing countries is, to a large degree, a direct consequence of a teaching crisis. Without joint efforts, this situation is likely to reproduce itself: Many of today’s poorly qualified teachers will continue teaching for years to come and consequently shape tomorrow’s teachers. Despite a growing consensus that inadequate teaching quality lies at the heart of the learning crisis, potential solutions to address the issue have remained understudied.

Teacher training programs may be a promising strategy to cut through the outlined vicious cycle that plagues many schooling systems. The main goal of our project is to assess the potential of such programs to raise student learning outcomes in a context that is characterized by a twin deficit among teachers: a lack of both pedagogical knowledge and content knowledge. We further aim to analyze how gains in teachers’ competencies are passed on to students, and how training programs should be designed to optimize their effectiveness. We are particularly interested in quantifying the relative efficacy of pedagogical and content-related training elements, and whether combining them unfolds relevant complementarities.

Our team of economists, sociologists, and educational scientists has designed a randomized controlled trial (RCT) to be rolled out across 338 primary schools in El Salvador. Its core features three teacher training programs focusing on either (i) teaching skills (didactics), (ii) content knowledge, or (iii) a combination of both inputs. During one school year, 254 primary school math teachers will participate in one of these training programs that are planned to share a common basic framework combining face-to-face meetings, coaching elements, and self-study modules. To quantify the impact of the interventions, we plan to collect comprehensive data on teacher competence (i.e. content knowledge & teaching practices) as well as student learning outcomes in math across two consecutive school years.
External Link(s)

Registration Citation

Citation
Brunetti, Aymo et al. 2022. "Improving the quality of education in developing countries: An experimental evaluation of teacher training programs in El Salvador." AEA RCT Registry. September 27. https://doi.org/10.1257/rct.10035-1.0
Sponsors & Partners

Sponsors

Experimental Details

Interventions

Intervention(s)
The intervention features three teacher training programs focusing on either (i) teaching methods (didactics), (ii) content knowledge, or (iii) a combination of both inputs. Between mid-May 2022 and early October 2022, primary school teachers participate in one of these professional development programs, which share a common basic framework combining (a) seven face-to-face meetings, (b) coaching elements, and (c) six self-study modules.
Intervention Start Date
2022-05-21
Intervention End Date
2022-10-01

Primary Outcomes

Primary Outcomes (end points)
Student learning outcomes in mathematics measured via pencil and paper assessments.
Primary Outcomes (explanation)
The student assessments mirror the Salvadoran curriculum of grades 1 to 5. In particular, test items are based on local textbooks and international assessments, then mapped to the Salvadoran curriculum, and finally validated by local experts. We will administer different (and grade-specific) tests in each assessment cycle. Yet, overlapping items will enable us to link results across the different test booklets and allow for a projection of learning progress onto a common scale through Item Response Theory (e.g. de Ayala, 2009). All assessments take 60 minutes and will be administered within students’ classrooms by a team of trained local field workers.
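The cross-booklet linking step can be sketched with a simple mean–mean linking of item difficulty estimates on a Rasch metric. The item names and difficulty values below are invented for illustration; the registry does not specify the actual linking procedure:

```python
# Mean-mean linking of two test booklets via shared anchor items (Rasch metric).
# All item names and difficulty values are illustrative, not real study estimates.
booklet_a = {"frac_01": -0.50, "add_03": 0.20, "geom_02": 1.10}
booklet_b = {"frac_01": -0.20, "add_03": 0.55, "word_04": 0.90}

# Overlapping items appear in both booklets and identify the scale shift.
anchors = booklet_a.keys() & booklet_b.keys()
shift = sum(booklet_a[i] - booklet_b[i] for i in anchors) / len(anchors)

# Shift booklet-B difficulties onto the booklet-A scale.
booklet_b_linked = {item: b + shift for item, b in booklet_b.items()}
```

After the shift, the anchor items have the same mean difficulty in both booklets, so abilities estimated against either booklet live on one common scale.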

To quantify learning outcomes, we will (a) compute the share of correct answers, (b) standardize the share of correct answers to have a wave-specific mean of zero and a wave-specific standard deviation of one in the control group, and (c) use Item Response Theory to compute a mathematics score that has a common scale for teachers and for students of all grade levels.
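Step (b) can be sketched in a few lines of Python; the scores below are invented for illustration:

```python
from statistics import mean, stdev

# Illustrative data: share of correct answers per student, one assessment wave.
control = [0.42, 0.55, 0.38, 0.61, 0.47]  # control-group students (invented)
treated = [0.52, 0.66, 0.49]              # treatment-group students (invented)

# Wave-specific moments are taken from the control group only.
mu, sigma = mean(control), stdev(control)

def standardize(scores):
    """Express scores in control-group standard-deviation units."""
    return [(x - mu) / sigma for x in scores]

z_control = standardize(control)
z_treated = standardize(treated)
# By construction, z_control has mean 0 and standard deviation 1 in this wave.
```

Standardizing against the control group (rather than the full sample) keeps the scale unaffected by treatment-induced score changes.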


Secondary Outcomes

Secondary Outcomes (end points)
Teacher content knowledge in mathematics (assessment) and teaching practices (classroom observations)
Secondary Outcomes (explanation)
The teacher assessments are designed to capture teachers’ content mastery of the Salvadoran primary school math curriculum. The tests comprise validated items from different subject domains and grade levels and partly overlap with the student tests, allowing us to represent student and teacher outcomes on the same scale. Teacher tests take 90 minutes and are administered during regional meet-ups.

To quantify teacher content knowledge, we (a) compute the share of correct answers, (b) standardize the share of correct answers to have a wave-specific mean of zero and a wave-specific standard deviation of one in the control group, and (c) use Item Response Theory to compute a mathematics score that has a common scale for teachers and students.

To measure teaching skills, we conduct classroom observations during and after the intervention. In unannounced visits, enumerators will collect data on teachers’ teaching practices using an adaptation of the Stallings classroom observation system.

Experimental Design

Experimental Design
The intervention features three teacher training programs focusing on either (i) teaching methods (didactics), (ii) content knowledge, or (iii) a combination of both inputs. Between mid-May 2022 and early October 2022, primary school teachers participate in one of these professional development programs, which share a common basic framework combining (a) seven face-to-face meetings, (b) coaching elements, and (c) six self-study modules.

1. In February and March 2022, we conduct school visits in the Salvadoran departments of (i) La Unión, (ii) Morazán, (iii) San Miguel, and (iv) Usulután to inform school heads and teachers about the professional development program and the study. Teachers register voluntarily with the representative conducting the school visit or later via phone. The target population comprises all teachers who instruct math to at least one class in grade four or five in 2022.

2. We invite the teachers who voluntarily registered for the program to the baseline teacher assessment (April 2022). At the baseline assessment, teachers (i) provide written consent to participate in the program and study, (ii) take a 90-minute mathematics assessment, (iii) fill in a socio-demographic survey, and (iv) verify/provide information on their weekly teaching timetable.

3. Informed by our power calculations and based on the list of participants at the teacher assessment, we select 338 teachers to participate in the study. To avoid distortion of treatment effect estimates via spillovers (e.g. Miguel & Kremer, 2004), the teachers are drawn from 338 different schools.

4. We conduct the baseline assessment (plus a short survey) with the fourth- and fifth-grade classes that are taught mathematics by teachers in our study sample (April/May 2022). For each teacher, one class per target grade level (4 and 5) is assessed. Note that some teachers teach only fourth graders (or only fifth graders). Very few teachers instruct multiple classes at the same grade level; for those teachers, students of stream A (e.g. class 4A rather than class 4B) participate in the assessment.

5. During the student baseline assessment we randomize teachers into treatment and control. Randomization is conducted by the research team using the software R. The random assignment is stratified by region and terciles of baseline teacher performance; overall, this yields 12 strata (4 regions x 3 performance terciles).

6. Teachers are informed about their assignment to one of four experimental groups (T1, T2, T3, Control).

7. The three treatment arms are implemented from 21 May to 1 October 2022.

8. Between mid-August and mid-September 2022, enumerators conduct classroom observations in one mathematics lesson (45 or 90 minutes) of each teacher in the study sample; they use an adapted version of the Stallings Tool to collect information.

9. Midline assessments with students are conducted in October 2022.

10. Midline assessments with teachers are conducted in November 2022.

11. Depending on how many classes are assigned to new teachers between the two school years 2022 and 2023, a mid-implementation "baseline" student assessment is conducted in February 2023.

12. In August 2023, enumerators conduct classroom observations in one mathematics lesson (45 or 90 minutes) of each teacher in the study sample; they use an adapted version of the Stallings Tool to collect information on teaching practices.

13. Endline assessments with students are conducted in October 2023.

14. Endline assessments with teachers are conducted in November 2023.
Experimental Design Details
Not available
Randomization Method
Randomization is conducted by the research team using the software R. The random assignment is stratified by region and terciles of baseline teacher performance; overall, this yields 12 strata (4 regions x 3 performance terciles).
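A stratified assignment of this kind can be sketched as follows. The study used R; this Python translation is purely illustrative, and the teacher IDs, stratum sizes, and seed are invented:

```python
import random
from collections import defaultdict
from itertools import cycle

random.seed(2022)  # illustrative seed; the study's actual seed is not reported

# Invented roster: 12 strata (4 regions x 3 baseline-performance terciles),
# 8 teachers per stratum.
teachers = [(f"t{i:03d}", region, tercile)
            for i, (region, tercile) in enumerate(
                (r, t)
                for r in ["La Union", "Morazan", "San Miguel", "Usulutan"]
                for t in [1, 2, 3]
                for _ in range(8))]

# Group teachers into strata.
strata = defaultdict(list)
for tid, region, tercile in teachers:
    strata[(region, tercile)].append(tid)

# Within each stratum, shuffle and rotate through the four experimental arms
# so that arms are balanced on region and baseline performance.
assignment = {}
for members in strata.values():
    random.shuffle(members)
    for tid, arm in zip(members, cycle(["T1", "T2", "T3", "Control"])):
        assignment[tid] = arm
```

Because each stratum's size here is a multiple of four, every arm receives exactly the same number of teachers per stratum; with uneven stratum sizes the rotation leaves at most a one-teacher imbalance per stratum.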
Randomization Unit
Teachers (clusters)
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
338 teachers
Sample size: planned number of observations
6084 (18 pupils per teacher/cluster)
Sample size (or number of clusters) by treatment arms
85 teachers in treatment arm 1 (training in didactics), 85 teachers in treatment arm 2 (training in mathematics), 84 teachers in treatment arm 3 (training combining didactics and mathematics), and 84 teachers in the control group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
MDES for the mathematics outcome at the student level (alpha = 0.05, power = 80%): 0.12 SD to 0.16 SD. We use a range of plausible values for the intracluster correlation, the between-cluster R2, and the within-cluster R2, informed by data from El Salvador and the Optimal Design software (see Spybrook et al. 2011).
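The stated range can be reproduced with the standard two-level cluster-RCT formula. The sketch below uses a normal approximation to the test-statistic multiplier, and the intracluster correlation and R2 values are illustrative assumptions, not the study's actual inputs:

```python
from math import sqrt
from statistics import NormalDist

def mdes_cluster(J, n, rho, r2_between, r2_within, alpha=0.05, power=0.80, P=0.5):
    """Minimum detectable effect size (in SD units) for a two-level
    cluster-randomized comparison with J clusters of n students each,
    a share P of clusters treated, intracluster correlation rho, and
    covariate R-squared at each level. Normal approximation to the multiplier."""
    M = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    var = (rho * (1 - r2_between) / (P * (1 - P) * J)
           + (1 - rho) * (1 - r2_within) / (P * (1 - P) * J * n))
    return M * sqrt(var)

# One pairwise arm-vs-control comparison: ~169 of the 338 clusters, 18 pupils
# each; rho and R2 values are assumed for illustration.
mdes = mdes_cluster(J=169, n=18, rho=0.15, r2_between=0.5, r2_within=0.3)
# ~0.14 SD for these inputs, inside the registered 0.12--0.16 SD range.
```

Varying rho and the R2 values over plausible ranges traces out the reported MDES interval; more covariate explanatory power or a lower intracluster correlation shrinks the detectable effect.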
IRB

Institutional Review Boards (IRBs)

IRB Name
The Ethics Committee of the Faculty of Business Administration, Economics and Social Sciences of the University of Bern
IRB Approval Date
2020-11-23
IRB Approval Number
222020