Perceived Ability and School Choices

Last registered on December 17, 2018

Pre-Trial

Trial Information

General Information

Title
Perceived Ability and School Choices
RCT ID
AEARCTR-0003429
Initial registration date
October 16, 2018

First published
October 22, 2018, 12:55 AM EDT

Last updated
December 17, 2018, 5:51 AM EST

Locations

Region

Primary Investigator

Affiliation
Inter-American Development Bank

Other Primary Investigator(s)

PI Affiliation
University of Toulouse Capitole

Additional Trial Information

Status
Ongoing
Start date
2014-02-15
End date
2019-12-31
Secondary IDs
Abstract
This project aims to study the role of youths' self-perceptions of ability in their sorting patterns across schools. We design and implement a field experiment in which ninth graders from less advantaged backgrounds in Mexico are provided with individualized feedback about their performance on an achievement test. We then examine the effect of the informational intervention on school choices and on schooling outcomes at the end of high school.
External Link(s)

Registration Citation

Citation
Bobba, Matteo and Veronica Frisancho. 2018. "Perceived Ability and School Choices." AEA RCT Registry. December 17. https://doi.org/10.1257/rct.3429-2.0
Former Citation
Bobba, Matteo and Veronica Frisancho. 2018. "Perceived Ability and School Choices." AEA RCT Registry. December 17. https://www.socialscienceregistry.org/trials/3429/history/38980
Sponsors & Partners

There is information in this trial unavailable to the public; access may be requested via the Registry.
Experimental Details

Interventions

Intervention(s)
We design and implement a field experiment that provides students from disadvantaged backgrounds with individualized feedback on their academic performance during the transition from middle to high school.
Intervention (Hidden)
The research design is nested within a large-scale assignment mechanism that allocates students across high-school programs in Mexico City according to applicants' school rankings and their performance on an achievement test. We administer a mock version of the actual test, communicate individual scores to a randomly chosen subset of subjects, and elicit probabilistic statements about expected performance on the actual test using bean counts. Although this task may appear challenging a priori, our approach turns out to be intuitive and accessible for the age group targeted by the intervention. We further show that, in our setting, the mock-exam score provides students with a signal that is easy to interpret and contains relevant information about their academic potential. The design of the field experiment also includes a pure control group of applicants who do not take the mock test. This group allows us to distinguish between the effects of taking the test and those of receiving performance feedback.
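The bean-count elicitation described above can be read as a discretized subjective probability distribution: the share of beans a student places on each score interval is interpreted as the subjective probability of scoring in that range. The sketch below illustrates this interpretation only; the function name, the 20-bean total, and the score bins are illustrative assumptions, not the study's actual protocol or code.

```python
def beliefs_from_beans(bean_counts, bin_midpoints):
    """Convert a bean-count elicitation into a subjective score distribution.

    `bean_counts[i]` is the number of beans the student placed on score
    interval i; `bin_midpoints[i]` is that interval's midpoint. The bean
    shares are read as subjective probabilities, from which we compute the
    implied mean and standard deviation of the expected test score.
    (Illustrative sketch: bins and bean totals are assumptions.)
    """
    total = sum(bean_counts)
    probs = [b / total for b in bean_counts]
    mean = sum(p * m for p, m in zip(probs, bin_midpoints))
    var = sum(p * (m - mean) ** 2 for p, m in zip(probs, bin_midpoints))
    return {"probs": probs, "mean": mean, "sd": var ** 0.5}
```

For example, a student who places 5, 10, and 5 of 20 beans on intervals centered at 50, 70, and 90 points reveals a subjective mean score of 70.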
Intervention Start Date
2014-02-15
Intervention End Date
2014-03-31

Primary Outcomes

Primary Outcomes (end points)
school choices (school preferences), enrollment, and on-time graduation
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
participation in the assignment mechanism, placement across school alternatives, score on the admission exam, and length of submitted school rankings
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Among a restricted universe of schools in disadvantaged neighborhoods, we randomize the treatment within 12 strata, defined by the geographic location of the school and its academic performance in the 2012 national performance evaluation. In the treatment group (n=44 schools), we administer the mock exam and provide face-to-face feedback on performance. A second arm, the placebo group (n=46 schools), takes the mock exam but receives no information about test results. Finally, we also include a pure control group (n=28 schools).
Experimental Design Details
We impose two restrictions to select the experimental sample from the universe of potential COMIPEMS applicants. First, we focus on schools with a considerable mass of applicants in 2012 (more than 30). Second, we consider schools located in neighborhoods with high or very high poverty levels (according to the National Population Council in 2010). Students in these areas are less likely to have access to prior informative signals about their own academic potential in general, and about their performance on the COMIPEMS exam in particular. Indeed, data from the 2012 edition of the assignment system show that, on average, 44 percent of applicants in schools located in more affluent neighborhoods took preparatory courses before submitting their school rankings, while this figure drops to 12 percent among schools in high-poverty areas. Among the applicants in our sample, 16 percent report previous exposure to a mock test of the admission exam with performance feedback.

Schools that comply with these criteria are grouped into four geographic regions and terciles of school-average performance among ninth graders on a national standardized achievement test (ENLACE, 2012). Treatment assignment is randomized within strata at the school level. As a result, 44 schools are assigned to a treatment group in which we administer the mock exam and provide face-to-face feedback on performance, 46 schools are assigned to a "placebo" group in which we only administer the mock exam without providing information about the test results, and 28 schools constitute a control group. Within each school in the experimental sample, we randomly pick one ninth-grade classroom to participate in the experiment.
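The stratified school-level randomization described above (12 strata from 4 regions × 3 performance terciles, with three arms in roughly 44/46/28 proportions) can be sketched as follows. This is a minimal illustrative sketch, not the study's actual randomization code: the function name, input format, seed, and the use of the overall 44/46/28 shares within each stratum are all assumptions.

```python
import random

def assign_treatment(schools, seed=2014):
    """Stratified school-level randomization into three experimental arms.

    `schools` is a list of dicts with keys "id", "region" (1-4), and
    "tercile" (1-3). Within each of the 12 strata, schools are shuffled
    and split into treatment, placebo, and control arms, approximating
    the study's overall 44/46/28 allocation. (Illustrative only.)
    """
    rng = random.Random(seed)
    # Group schools into the 12 strata (region x performance tercile).
    strata = {}
    for s in schools:
        strata.setdefault((s["region"], s["tercile"]), []).append(s["id"])
    assignment = {}
    for ids in strata.values():
        rng.shuffle(ids)
        n = len(ids)
        # Arm shares mirror the overall 44/46/28 split across 118 schools.
        n_treat = round(n * 44 / 118)
        n_placebo = round(n * 46 / 118)
        for i, school_id in enumerate(ids):
            if i < n_treat:
                assignment[school_id] = "treatment"
            elif i < n_treat + n_placebo:
                assignment[school_id] = "placebo"
            else:
                assignment[school_id] = "control"
    return assignment
```

Randomizing within strata guarantees that each region-by-performance cell contains schools from all three arms, so arm comparisons are balanced on geography and baseline achievement by construction.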

Randomization Method
Randomization done in office by a computer
Randomization Unit
School
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
118
Sample size: planned number of observations
3,644
Sample size (or number of clusters) by treatment arms
44 treatment schools, 46 placebo schools, 28 control schools
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public; access may be requested via the Registry.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials