Nimble evaluations of an online entrepreneurship and STEM training program in schools
Last registered on February 12, 2019

Pre-Trial

Trial Information
General Information
Title
Nimble evaluations of an online entrepreneurship and STEM training program in schools
RCT ID
AEARCTR-0003553
Initial registration date
November 20, 2018
Last updated
February 12, 2019 9:34 PM EST
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
World Bank
Other Primary Investigator(s)
PI Affiliation
University of Kassel
Additional Trial Information
Status
In development
Start date
2019-02-01
End date
2020-12-01
Secondary IDs
Abstract
Opportunity-focused, high-growth entrepreneurship and science-led innovation are crucial for continued economic growth and productivity. Working in these fields offers the opportunity for rewarding and high-paying careers. However, the majority of youth in developing countries do not consider either as job options, affecting their choices of what to study. Youth may not select these educational and career paths due to lack of knowledge, lack of appropriate skills, and lack of role models. We are working to provide a scalable approach to overcoming these constraints through an online education course for secondary school students that covers entrepreneurial soft skills, scientific methods, and interviews with role models. This course will be taken by students during class time, under teacher supervision.
We aim to use nimble evaluations to provide evidence on three policy problems:
1) Maximizing take-up: How can the Ministry of Education maximize take-up of the program in selected schools, ensuring that teachers use it in their classes?
2) Content ordering: Is it better to try to inspire students first through interviews with role models, or to first build their knowledge and skills so that they connect better with role models?
3) Standardized or Adaptive Learning: Should the same assessment tools be used for all students, or are learning gains higher with exercises that adapt to the interests and abilities of students?

External Link(s)
Registration Citation
Citation
Asanov, Igor and David McKenzie. 2019. "Nimble evaluations of an online entrepreneurship and STEM training program in schools." AEA RCT Registry. February 12. https://doi.org/10.1257/rct.3553-2.0.
Former Citation
Asanov, Igor and David McKenzie. 2019. "Nimble evaluations of an online entrepreneurship and STEM training program in schools." AEA RCT Registry. February 12. http://www.socialscienceregistry.org/trials/3553/history/41458.
Sponsors & Partners

There are documents in this trial unavailable to the public.
Experimental Details
Interventions
Intervention(s)
The online course will be taught in schools over 12 weeks in 2019 and will contain the following content: (1) entrepreneurship-related soft skills, covering personal initiative and negotiation skills; (2) basic scientific methods, covering the experimental method and basic statistics; and (3) role-model interviews with entrepreneurs and scientists. Placebo courses will be offered to a control group to measure the overall effectiveness of these courses.

The following nimble evaluation interventions will be undertaken to improve implementation fidelity and effectiveness of these courses:
1) Interventions to maximize teacher take-up: the Ministry of Education plans to train the teachers online about the new course content. We will test whether in-person training is cost-effective in delivering better compliance and usage. We will also test whether providing benchmarking information on take-up in other schools leads to higher usage.
2) Varying content ordering: we will randomize the ordering of modules to learn whether it is better to introduce students to role models before or after they get the skill content.
3) Introducing adaptive-learning exercises: the standard course will have the same content and exercises for all students. We will test whether student learning gains are higher if exercises are adapted to student skills and interests.
Intervention Start Date
2019-05-01
Intervention End Date
2019-11-30
Primary Outcomes
Primary Outcomes (end points)
Active take-up: Hours per student actively spent on the platform
Career attitudes: interest in entrepreneurial and scientific careers
Learning: Students’ performance on subject-specific tests
Steps towards career: Action-based outcomes.
Primary Outcomes (explanation)
Active take-up:
Detailed Description: Administrative data from the online monitoring system on how many days students log onto the material, how long they actively spend on the material each day, and on the percent of students in the class that participate.
Career attitudes: interest in entrepreneurial and scientific careers

Detailed Description: Students will be administered an online questionnaire to measure career intentions and self-efficacy (using content from OCTOSKILL and others).

Learning: Students’ performance on subject-specific tests

Detailed Description: Subject-specific tests will be designed to measure student knowledge of business, science, math, and literacy outcomes. These tests will be administered through the online system and given to both treatment and control schools.

Steps towards career: Action-based outcomes.

Detailed Description: We will advertise a set of career-related activities to see whether students exert effort to find out more about entrepreneurship and science career choices. This will include: (1) administrative data on whether students send SMS messages to request information in response to posters placed in schools; (2) administrative data on whether students participate in an interview challenge, in which they are asked to interview an entrepreneur and a scientist.

Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The three nimble evaluations will have randomization at different levels:
1) Teacher take-up: we will cross-randomize two treatments at the school level: (i) whether teachers are trained online about the new program, or in-person; and (ii) whether teachers receive weekly benchmarking information on take-up and usage in other schools.
2) Content-ordering: classes in the treatment schools will be randomized at the class level as to whether they receive the role model content at the start of the intervention or at the end.
3) Adaptive or non-adaptive exercises: individuals in the treated schools will be randomized at the individual level as to whether they receive standard exercises that are the same for everyone, or adaptive exercises that vary depending on initial performance and student interests.
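A minimal sketch of how the three randomization levels described above could be drawn by computer. All identifiers, the seed, and the exact per-school class and student counts are hypothetical (the real rosters are not public), and the sketch ignores any stratification the team may use:

```python
import random

rng = random.Random(3553)  # fixed seed (hypothetical value) so the draw is reproducible

# Hypothetical IDs; the real school, class, and student rosters are not public.
schools = [f"school_{i:03d}" for i in range(110)]

# Course vs placebo: 70 treatment schools, 40 control schools.
rng.shuffle(schools)
treatment_schools, control_schools = schools[:70], schools[70:]

# Nimble evaluation 1 (school level, all 110 schools): 2x2 cross-randomization
# of teacher-training mode and benchmarking information.
in_person = set(rng.sample(schools, 55))    # remainder: online training
benchmarked = set(rng.sample(schools, 55))  # remainder: no benchmarking

# Nimble evaluation 2 (class level, treatment schools only): content ordering.
classes = [(s, c) for s in treatment_schools for c in range(5)]  # ~5 classes/school
role_models_first = set(rng.sample(classes, len(classes) // 2))

# Nimble evaluation 3 (individual level): adaptive vs standard exercises.
students = [(s, c, j) for (s, c) in classes for j in range(36)]  # ~180 students/school
adaptive = set(rng.sample(students, len(students) // 2))
```

With the assumed counts this yields 350 treatment-school classes split 175/175 on content ordering, and an even individual-level split on adaptive exercises, consistent with the sample sizes reported below.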
Experimental Design Details
Not available
Randomization Method
Randomization to be done in office by computer
Randomization Unit
There are three different levels of randomization:
- The take-up interventions will be randomized at the school level.
- The content-ordering interventions will be randomized at the class level.
- The adaptive or non-adaptive exercise interventions will be randomized at the individual level.
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
We plan to work with 110 schools. These will be randomized into 70 treatment schools and 40 control schools. Across these 110 schools we will carry out a 2x2 randomization into whether training is in person or not, and whether benchmarking is provided or not.
Within the 70 treatment schools, there will be approximately 350 classes. We will randomize the ordering of topics in these classes.
The adaptive interventions are not clustered, but randomized at the individual level.
Sample size: planned number of observations
We anticipate 110 schools, and will work with grade levels 10 and 11 in these schools, for 220 grade-level cohorts. There are approximately 5 classes per school, so approximately 550 classes in total, and approximately 180 students per school, for approximately 20,000 students in total.
Sample size (or number of clusters) by treatment arms
Intervention 1: 55 schools in-person training, 55 schools online training
55 schools with benchmarking, 55 schools no benchmarking
Intervention 2: approximately 175 classes with role model content at the start of the course, approximately 175 classes with role model at the end
Intervention 3: approximately 10,000 students with adaptive exercises, approximately 10,000 with non-adaptive exercises.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Intervention 1: We assume a control mean of students completing 12 out of 18 hours of online content, with a standard deviation of 2 hours and an ICC of 0.5. The MDE is then 0.34 s.d. (R2=0.2) to 0.38 s.d. (R2=0), which is 0.7 to 0.8 hours more content covered.
Intervention 2: For career attitudes, we assume a baseline mean of 0.18, a baseline standard deviation of 0.38, and an ICC of 0.01. The MDE is then 0.06 s.d., which is a 0.02 change in the Octoskill measure.
Intervention 3: For the subject-specific test, we assume a baseline mean of 713, a standard deviation of 119, and an autocorrelation of 0.5 (baseline scores will help predict follow-up scores). The MDE is then 6 points on the test (0.05 s.d.).
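The school-level MDE figures above can be reproduced with a standard design-effect approximation for a two-arm cluster RCT. A minimal sketch, assuming a two-sided test at alpha = 0.05 and 80% power (neither is stated in the registration), with 55 schools per arm and roughly 180 students per school:

```python
import math

def cluster_mde(n_clusters_per_arm, cluster_size, icc, r2=0.0):
    """MDE in standard-deviation units for a two-arm cluster RCT,
    using the design-effect approximation.
    Assumes a two-sided test at alpha = 0.05 with 80% power."""
    z_alpha = 1.959964   # two-sided 5% critical value
    z_power = 0.841621   # 80% power
    deff = 1 + (cluster_size - 1) * icc  # design effect from clustering
    # variance of the treatment-effect estimate, in s.d.-of-outcome units,
    # deflated by covariates explaining a share r2 of outcome variance
    var = (1 - r2) * deff * 2.0 / (n_clusters_per_arm * cluster_size)
    return (z_alpha + z_power) * math.sqrt(var)
```

Under these assumptions, `cluster_mde(55, 180, 0.5)` is about 0.38 s.d. and `cluster_mde(55, 180, 0.5, r2=0.2)` about 0.34 s.d., matching the Intervention 1 figures; with an s.d. of 2 hours, 0.34 to 0.38 s.d. is about 0.7 to 0.8 hours of content.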
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Comité de Ética de Investigación en Seres Humanos Universidad San Francisco de Quito
IRB Approval Date
2018-12-21
IRB Approval Number
2018-208E