Healing After Conflict. Using Edtech Solutions and Expert Mentoring to Improve Mental Health and Learning for Students in Armenia

Last registered on April 09, 2026

Pre-Trial

Trial Information

General Information

Title
Healing After Conflict. Using Edtech Solutions and Expert Mentoring to Improve Mental Health and Learning for Students in Armenia
RCT ID
AEARCTR-0016497
Initial registration date
August 01, 2025

First published
August 04, 2025, 6:24 AM EDT

Last updated
April 09, 2026, 9:28 PM EDT

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
The World Bank

Other Primary Investigator(s)

PI Affiliation
The World Bank
PI Affiliation
Harvard University

Additional Trial Information

Status
Ongoing
Start date
2025-01-01
End date
2027-03-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
In conflict-affected settings, schools become the primary institution responsible for supporting children's psychosocial recovery, effectively expanding the education production function beyond academic learning to include mental health and social-emotional development. Yet teachers in these contexts are typically trained only to deliver academic content and lack the skills to identify or address the psychological needs of trauma-exposed students. This mismatch between what schools are expected to produce and what teachers are equipped to deliver represents a critical deficit in the human capital of the teaching workforce—one with potentially compounding consequences, as untreated mental health conditions impair concentration, attendance, peer relationships, and learning. Understanding whether investments in teachers' psychosocial capacity can improve student outcomes, and what complementary inputs are needed, is a first-order question for education policy in the growing number of countries affected by conflict and crisis.

This paper investigates these questions through a randomized controlled trial across 450 schools, covering over 30,000 students and more than 3,800 teachers in conflict-affected regions of Armenia. The core intervention provides nine months of specialized mentoring designed to equip teachers with the skills to integrate mental health and psychosocial support (MHPSS) into classroom practice. Trained mentors work directly with teachers over three phases of decreasing intensity, building capacity to recognize signs of distress, implement supportive classroom activities, and refer students with greater needs to specialized psychological services. This component directly targets the teacher skill constraint.

To test whether the returns to these skill investments depend on the information environment in which teachers operate, we cross-randomize access to a technology platform within the treatment group. The platform collects structured data from mentor observations and generates tailored, data-driven activity recommendations for individual teachers. By reducing the cost of acquiring and processing student-level information, the platform may allow teachers to allocate their newly acquired psychosocial skills more efficiently, matching support to the students and situations that need it most. The cross-randomization design allows us to distinguish between two hypotheses: that the binding constraint on effective psychosocial support in schools is teacher skill alone, or that even skilled teachers face information frictions that limit the productivity of their efforts.

Our primary outcomes are students' mental health and academic performance. We trace mechanisms through social-emotional skills, teacher support, peer environment, and educational aspirations. We also examine teacher-level outcomes to understand whether the intervention transforms how teachers engage with their expanded role. The findings contribute to the literature on teacher effectiveness and the production of non-cognitive skills, the economics of mental health in developing countries, and the broader question of when technology complements—rather than substitutes for—human expertise in education.
External Link(s)

Registration Citation

Citation
Dinarte, Lelys, Renata Lemos and Rony Rodriguez Ramirez. 2026. "Healing After Conflict. Using Edtech Solutions and Expert Mentoring to Improve Mental Health and Learning for Students in Armenia." AEA RCT Registry. April 09. https://doi.org/10.1257/rct.16497-2.0
Experimental Details

Interventions

Intervention(s)
The intervention is a school-based mental health and psychosocial support (MHPSS) program designed to integrate psychosocial tools into classroom instruction for students in grades 5 through 7. Developed in partnership with the Armenian Ministry of Education, Science, Culture and Sports (MoESCS) and the Teach for Armenia Foundation, the program targets both displaced children from Nagorno-Karabakh and host students in Armenian public schools. The program deliberately avoids explicit "mental health" framing in school communications, a design choice made in consultation with the Ministry to navigate stigma and facilitate parental acceptance.

The program operates through a mentor-centered delivery model and consists of three components:
Component 1: Mentor-led teacher training and in-classroom support. Trained mentors work alongside teachers to build their capacity to integrate MHPSS activities into daily instruction. The program draws on a database of over 62 psychosocial activities spanning different subjects, grade levels, durations, and activity types. These activities were designed with the support of international and local mental health experts and local curriculum specialists, taking the roll-out of the current national curriculum into consideration. The mentoring follows a phased structure over nine months with gradually decreasing intensity:

Phase 1 (Months 1-3): Full-time intense training. Mentors are engaged full time (40 hours per week), spending 5 hours per day, 4 days per week in direct work with teachers. The first month focuses on group training sessions after classes and in-classroom support, where mentors introduce MHPSS tools and observe their application. In the second month, out-of-classroom support shifts toward personalized one-on-one discussions with individual teachers requiring additional assistance. The third month consolidates learning, with increased in-classroom support and brief individual check-ins to ensure all teachers can integrate the tools into their teaching.

Phase 2 (Months 4-6): Part-time mentorship and monitoring. Mentor support reduces to 10 hours per week, focused primarily on in-classroom observation, feedback, and guidance. The mentor's role evolves toward ensuring adoption and fostering teacher autonomy in implementing the MHPSS tools.

Phase 3 (Months 7-9): Light support and follow-up. Mentors conduct monthly school visits and weekly phone-based check-ins. This phase is designed to reinforce practices, troubleshoot implementation challenges, and prepare school staff to continue MHPSS activities independently after the program ends.

Component 2: Referral system for specialized support. Children identified as requiring additional support are referred to school psychologists for specialized services according to their needs. The school psychologist provides continuity in referral services, while the principal ensures institutional support for MHPSS integration beyond the program period.

Component 3: Activities recommendation tool (cross-randomized within treatment). A key implementation challenge is activity selection: teachers must eventually choose independently from the database of 62+ psychosocial activities, a task that field observations suggest is difficult given the breadth of options and heterogeneity of classroom contexts. To test whether data-driven guidance can improve activity-context matching, access to a technology platform is cross-randomized within treated schools.

The tool is a web application in which teachers input five classroom parameters (number of students, subject area, grade level, duration, and preferred activity type) and receive a ranked list of recommended activities. Rankings are based on a composite score weighting activity effectiveness (based on mentor evaluations), fit with the teacher's classroom parameters, and teacher-reported preferences from prior usage. Activity data are collected by mentors through structured journals administered via KoboToolbox, processed via Google Sheets, and served to the web application. All tool usage (logins, activities viewed, activities saved, and activities implemented) is tracked by the team.
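The registration describes the ranking as a composite score over activity effectiveness, classroom fit, and teacher preferences, but does not specify the formula. A minimal sketch of such a weighted ranking, with hypothetical weights and field names chosen purely for illustration, could look like:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    effectiveness: float  # mentor-evaluated effectiveness, normalized to [0, 1]
    fit: float            # match with the teacher's five classroom parameters, [0, 1]
    preference: float     # teacher-reported preference from prior usage, [0, 1]

# Hypothetical weights: the registration does not disclose the actual weighting.
W_EFFECTIVENESS, W_FIT, W_PREFERENCE = 0.5, 0.3, 0.2

def composite_score(a: Activity) -> float:
    # Weighted sum of the three components described in the registration.
    return (W_EFFECTIVENESS * a.effectiveness
            + W_FIT * a.fit
            + W_PREFERENCE * a.preference)

def rank_activities(activities: list[Activity]) -> list[Activity]:
    # Return activities sorted from highest to lowest composite score.
    return sorted(activities, key=composite_score, reverse=True)
```

In practice the effectiveness input would come from the mentors' structured KoboToolbox journals and the preference input from logged tool usage; both are stylized here as pre-normalized scores.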
Intervention Start Date
2025-03-03
Intervention End Date
2026-11-30

Primary Outcomes

Primary Outcomes (end points)
Mental health (stress, anxiety, depression)
Academic performance (math and Armenian language)
Primary Outcomes (explanation)
Mental health is measured using two complementary instruments. The first is the Depression, Anxiety, and Stress Scale for Youth (DASS-Y; Szabo and Lovibond, 2022), a 21-item scale adapted for the Armenian context and validated for children and adolescents aged 7–18. The instrument comprises three subscales of seven items each, measuring depression, anxiety, and stress separately. The second instrument is the Library of Universal Mental Health Instruments (LUMI), developed by the Child Mind Institute's Global Center for Child and Adolescent Mental Health. LUMI is a free, multilingual, and culturally sensitive assessment tool designed for individuals aged 3 to 24, offering reliable measurement across multiple mental health domains, adapted and validated for the Armenian context. We administer the anxiety, depression, and PTSD modules.

Academic performance is measured using grade-specific assessments in Armenian Language and Mathematics, developed specifically for this study by curriculum specialists in collaboration with the Ministry of Education.

We may additionally include clinical assessments as a third instrument, contingent on approval by the Ministry of Education. If approved, these assessments will be conducted by psychologists trained in the instrument by specialists from the Republican Pedagogical-Psychological Center (RPPC).

DASS-Y and academic performance data will be collected at midline (after six months of the intervention) and at endline (at the end of the nine-month intervention). LUMI will be collected at endline only, in all three waves. Clinical assessments, if approved, may be conducted at endline in waves 2 and 3.

Secondary Outcomes

Secondary Outcomes (end points)
Students: Non-cognitive (socio-emotional) skills; teacher support; peer environment; educational aspirations.
Teachers: Mental health, non-cognitive (socio-emotional) skills, mental health literacy, peer environment
Implementation fidelity: characteristics of psychosocial activities implemented, time on task, classroom culture, and socioemotional skills.
Secondary Outcomes (explanation)
Secondary outcomes reported by students are organized into four families:

- The first family captures non-cognitive (socio-emotional) skills through three measures: emotion regulation, self-efficacy, and grit.
- The second family captures teacher support as a dimension of school inputs, measured through teacher support and through innovation in practices and interactions with students.
- The third family captures the peer environment through four measures: peer interaction, social cohesion, peer integration, and loneliness (reverse-coded).
- The fourth family captures educational aspirations, measured through items on the highest level of education students expect to complete and their perceived importance of schooling for future opportunities.

All families of students' secondary outcomes will be collected at midline and endline, although only a subset of the measures within each family will be collected at midline; all measures will be collected at endline.

Secondary outcomes reported by teachers/educators are organized into five families:
- The first family captures mental health through four measures: stress, anxiety, depression, and burnout.
- The second family captures teacher mental health literacy through two measures: attitudes towards mental health and mental health awareness.
- The third family captures teacher skills and behavior through three measures: self-efficacy, empathy, and prosocial behavior.
- The fourth family captures teacher beliefs and mindsets through four measures: teacher bias, outcome accountability, fixed mindset, and self-confidence.
- The fifth family captures the peer environment through three measures: peer sensitivity, cultural pluralism, and student-teacher interactions.

Implementation fidelity outcomes are reported by mentors through classroom observation journals and are organized into the following families: characteristics of psychosocial activities implemented, time on task (time on learning), classroom culture (supportive learning environment, positive behavioral expectations), and socioemotional skills (autonomy, perseverance, social and collaborative skills).

Experimental Design

Experimental Design
We implement a staggered cluster randomized controlled trial across 450 public schools in Armenia, carried out in three waves of 150 schools each. Schools were eligible if they (i) teach students in grades 5 through 7, (ii) have at least one displaced student, and (iii) are classified as basic (grades 1–9) or secondary (grades 1–12). From 468 eligible schools, we selected 450 by ranking schools within regions and school categories based on their share of displaced students.

Within each wave, schools are randomly assigned with equal probability to a treatment group (75 schools per wave, 225 total) or a control group (75 schools per wave, 225 total), stratified by municipality. Treatment schools receive the full nine-month MHPSS program, consisting of mentor-led teacher training and in-classroom support combined with a referral system for specialized psychological services. Control schools do not receive any program activities during the intervention period but complete the same survey instruments following identical protocols.

Within treatment schools, we implement a second randomization to evaluate the activities recommendation tool. In each wave, half of the treatment schools are randomly assigned to receive the tool and the other half continues with the standard program without the tool, stratified by region and rurality (urban vs. rural). This cross-randomization allows us to estimate the marginal effect of data-driven activity selection on student outcomes, holding constant the core mentoring and referral components.
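The registration states only that randomization was stratified (by municipality in the main arm, by region and rurality in the cross-randomization) and conducted in office using researcher-written code. A minimal sketch of such a stratified 1:1 assignment, with hypothetical field names and an arbitrary seed, could look like:

```python
import random

def assign_treatment(schools, stratum_key, seed=2025):
    """Stratified 1:1 assignment: within each stratum, shuffle the schools
    and assign the first half to treatment and the rest to control."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    strata = {}
    for school in schools:
        strata.setdefault(stratum_key(school), []).append(school)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        cutoff = len(members) // 2  # odd strata place the extra school in control
        for i, school in enumerate(members):
            assignment[school["id"]] = "treatment" if i < cutoff else "control"
    return assignment

# The same routine can then be reapplied within the treatment schools,
# with a region-by-rurality stratum key, to allocate the recommendation tool.
```

This is a sketch under stated assumptions, not the researchers' actual code; in particular, how odd-sized strata are balanced across waves is not described in the registration.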
Experimental Design Details
Not available
Randomization Method
Randomization was conducted in office using code prepared by the researchers.
Randomization Unit
School
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
450 schools
Sample size: planned number of observations
30,100 students, 3,833 teachers
Sample size (or number of clusters) by treatment arms
Main randomization: 225 schools assigned to treatment (75 per wave) and 225 schools assigned to control (75 per wave). Approximately 15,050 students and 1,917 teachers per arm.
Within-treatment cross-randomization: in each wave, approximately half of the 75 treatment schools are assigned to receive the activities recommendation tool and the other half continue with the standard program without it. That is, approximately 7,525 treated students are in schools that receive both the tool and the intervention, and 7,525 treated students are in schools that receive the intervention only.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
See PAP
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard University
IRB Approval Date
2024-10-09
IRB Approval Number
IRB24-0773