
Nimble evaluations of an online entrepreneurship and STEM training program in schools

Last registered on November 17, 2020

Pre-Trial

Trial Information

General Information

Title
Nimble evaluations of an online entrepreneurship and STEM training program in schools
RCT ID
AEARCTR-0003553
Initial registration date
November 20, 2018


First published
November 20, 2018, 4:09 PM EST


Last updated
November 17, 2020, 2:28 AM EST


Locations

Region

Primary Investigator

Affiliation
World Bank

Other Primary Investigator(s)

PI Affiliation
University of Kassel

Additional Trial Information

Status
In development
Start date
2019-02-01
End date
2020-12-01
Secondary IDs
Abstract
Opportunity-focused, high-growth entrepreneurship and science-led innovation are crucial for continued economic growth and productivity. Working in these fields offers the opportunity for rewarding and high-paying careers. However, the majority of youth in developing countries do not consider either as job options, affecting their choices of what to study. Youth may not select these educational and career paths due to lack of knowledge, lack of appropriate skills, and lack of role models. We are working to provide a scalable approach to overcoming these constraints through an online education course for secondary school students that covers entrepreneurial soft skills, scientific methods, and interviews with role models. This course will be taken by students during class time, under teacher supervision.
We then aim to use nimble evaluations to provide evidence on three policy problems:
1) Maximizing take-up: How can the Ministry of Education maximize take-up of the program in selected schools, ensuring that teachers use it in their classes?
2) Content ordering: Is it better to try to inspire students first through interviews with role models, or to first build their knowledge and skills so that they connect better with role models?
3) Standardized or Adaptive Learning: Should the same assessment tools be used for all students, or are learning gains higher with exercises that adapt to the interests and abilities of students?

External Link(s)

Registration Citation

Citation
Asanov, Igor and David McKenzie. 2020. "Nimble evaluations of an online entrepreneurship and STEM training program in schools." AEA RCT Registry. November 17. https://doi.org/10.1257/rct.3553-3.0
Former Citation
Asanov, Igor and David McKenzie. 2020. "Nimble evaluations of an online entrepreneurship and STEM training program in schools." AEA RCT Registry. November 17. https://www.socialscienceregistry.org/trials/3553/history/197995
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
The online course will be taught in schools over 12 weeks in 2019 and will contain the following content: (1) entrepreneurship-related soft skills – covering personal initiative and negotiation skills; (2) basic scientific methods – covering ideas of the experimental method and basic statistics; and (3) role model interviews with entrepreneurs and scientists. Placebo courses will be offered to a control group so that the overall effectiveness of these courses can be measured.

The following nimble evaluation interventions will be undertaken to improve implementation fidelity and effectiveness of these courses:
1) Interventions to maximize teacher take-up: the Ministry of Education plans to train the teachers online about the new course content. We will test whether in-person training is cost-effective in delivering better compliance and usage. We will also test whether providing benchmarking information on take-up in other schools leads to higher usage.
2) Varying content ordering: we will randomize the ordering of modules to learn whether it is better to introduce students to role models before or after they get the skill content.
3) Introducing adaptive-learning exercises: the standard course will have the same content and exercises for all students. We will test whether student learning gains are higher if exercises are adapted to student skills and interests.
Intervention Start Date
2019-05-01
Intervention End Date
2020-07-15

Primary Outcomes

Primary Outcomes (end points)
Active take-up: Hours per student actively spent on the platform
Career attitudes: interest in entrepreneurial and scientific careers
Learning: Students’ performance on subject-specific tests
Steps towards career: Action-based outcomes.
Primary Outcomes (explanation)
Active take-up:
Detailed Description: Administrative data from the online monitoring system on how many days students log onto the material, how long they actively spend on the material each day, and the percent of students in each class who participate.

Career attitudes: interest in entrepreneurial and scientific careers
Detailed Description: Students will be administered an online questionnaire to measure career intentions and self-efficacy (using content from OCTOSKILL and others).

Learning: Students’ performance on subject-specific tests
Detailed Description: Subject-specific tests will be designed to measure student knowledge of business, science, math, and literacy. These tests will be administered through the online system and given to both treatment and control schools.

Steps towards career: Action-based outcomes
Detailed Description: We will advertise a set of career-related activities to see whether students exert effort to find out more about entrepreneurship and science career choices. These will include: (1) administrative data on whether students send SMS messages to request information in response to posters placed in schools; and (2) administrative data on whether students participate in an interview challenge, in which they are asked to interview an entrepreneur and a scientist.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The three nimble evaluations will have randomization at different levels (see the illustrative sketch after this list):
1) Teacher take-up: we will cross-randomize two treatments at the school level: (i) whether teachers are trained online about the new program, or in-person; and (ii) whether teachers receive weekly benchmarking information on take-up and usage in other schools.
2) Content-ordering: classes in the treatment schools will be randomized at the class level as to whether they receive the role model content at the start of the intervention or at the end.
3) Adaptive or non-adaptive exercises: individuals in the treated schools will be randomized at the individual level as to whether they receive standard exercises which are the same for everyone, or to whether they get adaptive-exercises which vary depending on initial performance and student interests.
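
A minimal sketch of how these three levels of computerized assignment could be scripted. The seed, school identifiers, and exact balancing scheme below are illustrative assumptions, not the trial's actual code; in practice the 70/40 treatment/control split would be a separate draw.

```python
import random

rng = random.Random(3553)  # illustrative seed (here, the RCT ID), not the one actually used

# Level 1 (school): 2x2 cross-randomization of training mode and benchmarking,
# balanced to 55/55 on each factor (cells of 28/27/27/28, since 110 is not divisible by 4).
schools = [f"school_{i:03d}" for i in range(110)]
rng.shuffle(schools)
cells = [("in-person", "benchmarking", 28), ("in-person", "no benchmarking", 27),
         ("online", "benchmarking", 27), ("online", "no benchmarking", 28)]
school_arm, idx = {}, 0
for training, benchmark, n in cells:
    for s in schools[idx:idx + n]:
        school_arm[s] = (training, benchmark)
    idx += n

# Level 2 (class): role-model modules first vs. last, within the ~70 treatment schools.
treatment_schools = schools[:70]
classes = [(s, c) for s in treatment_schools for c in range(5)]  # ~5 classes per school
rng.shuffle(classes)
role_model_first = set(classes[:len(classes) // 2])

# Level 3 (student): adaptive vs. standard exercises, a fair draw per student.
def exercise_arm() -> str:
    return "adaptive" if rng.random() < 0.5 else "standard"
```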
Experimental Design Details
Randomization Method
Randomization to be done in office by computer
Randomization Unit
There are three different levels of randomization:
- The take-up interventions will be randomized at the school level.
- The content ordering interventions will be randomized at the class level.
- The adaptive or non-adaptive exercise interventions will be randomized at the individual level.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
We plan to work with 110 schools. These will be randomized into 70 treatment schools and 40 control schools. Across these 110 schools we will carry out a 2x2 randomization into whether training is in person or not, and whether benchmarking is provided or not.
Within the 70 treatment schools, there will be approximately 350 classes. We will randomize the ordering of topics in these classes.
The adaptive interventions are not clustered, but randomized at the individual level.
Sample size: planned number of observations
We anticipate 110 schools and will work with grade levels 10 and 11 in these schools, for 220 grade cohorts. There are approximately 5 classes per school, so approximately 550 classes in total, and approximately 180 students per school, for approximately 20,000 students in total.
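
As a quick check of the arithmetic above (a sketch using the approximate per-school figures stated in this entry):

```python
schools = 110
grade_levels = 2           # levels 10 and 11
classes_per_school = 5     # approximate
students_per_school = 180  # approximate

print(schools * grade_levels)         # 220 grade cohorts
print(schools * classes_per_school)   # 550 classes
print(schools * students_per_school)  # 19,800, i.e. approximately 20,000 students
```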
Sample size (or number of clusters) by treatment arms
Intervention 1: 55 schools in-person training, 55 schools online training
55 schools with benchmarking, 55 schools no benchmarking
Intervention 2: approximately 175 classes with role model content at the start of the course, and approximately 175 classes with role model content at the end
Intervention 3: approximately 10,000 students with adaptive exercises, and approximately 10,000 with non-adaptive exercises.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Intervention 1: We assume a control mean of students completing 12 out of 18 hours of online content, a standard deviation of 2 hours, and an ICC of 0.5. The MDE is then 0.34 s.d. (R2=0.2) to 0.38 s.d. (R2=0), which is 0.7 to 0.8 hours more content covered.
Intervention 2: For career attitudes, we assume a baseline mean of 0.18, a baseline standard deviation of 0.38, and an ICC of 0.01. The MDE is then 0.06 s.d., which is a 0.02 change in the Octoskill measure.
Intervention 3: For the subject-specific test, we assume a baseline mean of 713, a standard deviation of 119, and an autocorrelation of 0.5 (baseline scores will help predict follow-up scores). The MDE is then 6 points on the test (0.05 s.d.).
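
For reference, the Intervention 1 numbers above can be reproduced with the standard minimum detectable effect formula for cluster-randomized designs. The sketch below assumes 80% power, a 5% two-sided test (multiplier M ≈ 2.8), and equal allocation across arms; these power parameters are assumptions, since the entry does not state them.

```python
from math import sqrt

def clustered_mde_sd(J, m, icc, r2=0.0, P=0.5, M=2.8):
    """MDE in standard-deviation units for a cluster-randomized trial.

    J clusters of size m, intra-cluster correlation icc, covariate R^2 of r2;
    M ~ 2.8 is the multiplier for 80% power at alpha = 0.05 (assumed).
    """
    allocation = sqrt(1 / (P * (1 - P) * J))  # arm-allocation term
    clustering = sqrt(icc + (1 - icc) / m)    # design-effect penalty
    return M * allocation * clustering * sqrt(1 - r2)

sd_hours = 2.0  # standard deviation of hours of online content covered (from this entry)
for r2 in (0.0, 0.2):
    mde = clustered_mde_sd(J=110, m=180, icc=0.5, r2=r2)
    print(f"R2={r2:.1f}: MDE = {mde:.2f} s.d. = {mde * sd_hours:.2f} hours")
# -> 0.38 s.d. (0.76 hours) with no covariates; 0.34 s.d. (0.68 hours) with R2 = 0.2
```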
IRB

Institutional Review Boards (IRBs)

IRB Name
Comité de Ética de Investigación en Seres Humanos Universidad San Francisco de Quito
IRB Approval Date
2018-12-21
IRB Approval Number
2018-208E
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
May 30, 2020, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
July 31, 2020, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
In total we worked with 108 schools, containing 560 classes and 15,443 students.
[Note: the sample sizes in this registry entry refer to the benchmarking treatment, which is published.]
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
12,232 students reached the endline survey in July 2020
Final Sample Size (or Number of Clusters) by Treatment Arms
54 schools to benchmarking, 54 to control
Data Publication

Data Publication

Is public data available?
Yes
Public Data URL

Program Files

Program Files
Yes
Program Files URL
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
Many school systems across the globe turned to online education during the COVID-19 pandemic. This context differs significantly from the prepandemic situation in which massive open online courses attracted large numbers of voluntary learners who struggled with completion. Students who are provided online courses by their high schools also have their behavior determined by actions of their teachers and school system. We conducted experiments to improve participation in online learning before, during, and right after the COVID-19 outbreak, with 1,151 schools covering more than 45,000 students in their final years of high school in Ecuador. These experiments tested light-touch interventions at scale, motivated by behavioral science, and were carried out at three levels: that of the system, teacher, and student. We find the largest impacts come from intervening at the system level. A cheap, online learning management system for centralized monitoring increased participation by 0.21 SD and subject knowledge by 0.13 SD relative to decentralized management. Centralized management is particularly effective for underperforming schools. Teacher-level nudges in the form of benchmarking emails, encouragement messages, and administrative reminders did not improve student participation. There was no significant impact of encouragement messages to students, or of having them plan and team up with peers. Small financial incentives in the form of lottery prizes for finishing lessons did increase study time, but were less cost-effective and had no significant impact on knowledge. The results show the difficulty in incentivizing online learning at scale, and a key role for central monitoring.
Citation
Igor Asanov, Anastasiya-Mariya Asanov, Thomas Åstebro, Guido Buenstorf, Bruno Crépon, Francisco Pablo Flores T., David McKenzie, Mona Mensmann, and Mathis Schulte (2023). "System-, teacher-, and student-level interventions for improving participation in online learning at scale in high schools." PNAS 120(30): e2216686120, July 17, 2023.

Reports & Other Materials