The Effects of Nudging Summer Youth Employment Program Applicants to Apply to More Jobs

Last registered on April 13, 2023

Pre-Trial

Trial Information

General Information

Title
The Effects of Nudging Summer Youth Employment Program Applicants to Apply to More Jobs
RCT ID
AEARCTR-0011180
Initial registration date
April 01, 2023

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 13, 2023, 3:30 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Primary Investigator

Affiliation
Northeastern University

Other Primary Investigator(s)

PI Affiliation
Northeastern University
PI Affiliation
Northeastern University

Additional Trial Information

Status
In development
Start date
2023-04-01
End date
2023-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Summer Youth Employment Programs (SYEP) have been shown to have significant impacts on youth outcomes such as reducing violent crime, increasing high school graduation, and boosting subsequent employment and wages. Much of this research is based on lotteries from oversubscribed programs. However, many cities find it infeasible to allocate youth to jobs using simple random assignment because of heterogeneous preferences among employers and youth participants: the matching process is more complex, and the program must balance both youth and employer interests to ensure participation. Even when random assignment is used, two issues arise that may lead to inequitable outcomes. First, some SYEPs conduct random assignment at the employer level rather than the program level, and our prior research shows that this can harm BIPOC applicants when the distribution of applications to positions is imbalanced by race. In addition, this racial inequity may be exacerbated if there is an imbalance between the number of applicants and the number of openings across employers (e.g., some jobs are over- versus under-subscribed), such that youth and/or employers may not get their first choice. Our prior research documents that in the extreme case, where over half of youth apply to only one job, this labor market mismatch between applicants and jobs means that no match can be made for a significant fraction of youth.

To help address this problem, we propose an RCT testing the effectiveness of an informational treatment to improve job application behavior among youth by giving them salient information about successful applicants and helping them to reflect on their job search. Each week, we will randomize youth who have applied to fewer than three positions into a treatment and a control group, stratified by availability of text and parent email. The treatment group will receive an email and/or text message nudging them to apply to more jobs, along with salient information about the success rate of applicants from the prior year. The control group will not receive any nudging communication. If the sample size is sufficient, we will further subdivide the treatment group, with one arm receiving a subsequent 'reflection' treatment in the form of an optional Qualtrics survey to help youth identify which of the remaining positions still seeking candidates meet their specified criteria.
External Link(s)

Registration Citation

Citation
Hoover, Hanna, Mindy Marks and Alicia Modestino. 2023. "The Effects of Nudging Summer Youth Employment Program Applicants to Apply to More Jobs." AEA RCT Registry. April 13. https://doi.org/10.1257/rct.11180-1.0
Sponsors & Partners

Some information in this trial is unavailable to the public and may be requested from the Registry.

Experimental Details

Interventions

Intervention(s)
We propose an RCT testing the effectiveness of an informational treatment to improve job application behavior among youth by giving them salient information about successful applicants and helping them to reflect on their job search. Each week, we will randomize youth who have applied to fewer than three positions into a treatment and a control group, stratified by availability of text and parent email. The treatment group will receive an email and/or text message nudging them to apply to more jobs, along with salient information about the success rate of applicants from the prior year. The message will encourage youth in the treatment group to apply to at least 3 positions and to positions that have numerous openings, along with a bar chart showing the hiring rate of youth from the prior year who applied to 2 or fewer jobs versus those who applied to 3 or more jobs. The control group will not receive any nudging communication.

If the sample size is sufficient, we will further subdivide the treatment group with one arm receiving a subsequent 'reflection' treatment. This reflection message will only be sent to those in the treatment group who still have not applied to at least three jobs one week prior to the close of the application portal. The reflection message will encourage youth to take a Qualtrics survey to find jobs that match their interests and still have openings.



We will send informational emails to the treatment group. The treatment group will be youth who applied to the Boston SYEP within the prior week and have applied to 2 or fewer jobs. We will stratify across treatment and control groups by whether the youth included a phone number, a parent's email address, and a parent's phone number. If there are sufficient numbers within the treatment group, we will send a second email nudge to half of the initial treatment group; the other half will not receive the second nudge. Again, we will stratify selection into this second treatment group by having a phone number, parent email address, and parent's phone number. In this second informational experiment, we will direct youth to a Qualtrics survey asking which industries, occupations, and neighborhoods they prefer to work in. Based on their selections, we will suggest employer partners to apply to who still have employment slots available.
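The registration states that randomization will be done in Stata; as a minimal illustrative sketch only, the weekly stratified assignment described above could look like the following in Python. The field names (`has_phone`, `has_parent_email`, `has_parent_phone`) are hypothetical, not taken from the actual SuccessLink portal export.

```python
import random

def stratified_assign(applicants, seed=2023):
    """Randomize applicants to treatment/control within strata defined by
    availability of a phone number, parent email, and parent phone number.

    `applicants` is a list of dicts with (hypothetical) boolean contact
    fields; each dict gets an "arm" key added in place.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible assignment
    strata = {}
    for a in applicants:
        key = (a["has_phone"], a["has_parent_email"], a["has_parent_phone"])
        strata.setdefault(key, []).append(a)
    # Within each stratum, shuffle and split: first half treatment,
    # remainder control (odd strata put the extra youth in control).
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        for i, a in enumerate(members):
            a["arm"] = "treatment" if i < half else "control"
    return applicants
```

The same routine could be rerun on each week's pool of applicants with fewer than three applications, and again on the treatment group when forming the reflection arms.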

Intervention Start Date
2023-04-03
Intervention End Date
2023-06-30

Primary Outcomes

Primary Outcomes (end points)
Our primary outcomes include:
Number of job applications the youth submitted (total and whether >=3)
Number of job applications the youth submitted by two weeks after the initial email (total and whether >=3)
Whether the youth applied to jobs with multiple openings (applied to any, how many applied to, average number of openings)
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Our secondary outcomes include:
Whether the youth participated in the Qualtrics reflection
Whether the youth applied to under-subscribed jobs identified by the Qualtrics reflection
Whether the youth was ever selected for employment
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Each week, we will collect data from the application portal and randomize youth who have applied to fewer than three positions into a treatment and a control group, stratified by availability of text and parent email. If the sample size allows, we will also use the same randomization procedure on April 24th to randomize those in the treatment group who still have not applied to at least three jobs into whether they will receive the follow-up reflection message and Qualtrics survey.
Experimental Design Details
These applicants come from the 2023 City of Boston SuccessLink program portal.
Randomization Method
Randomization will be done using Stata statistical software.
Randomization Unit
Randomization will occur at the individual level each week.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
0
Sample size: planned number of observations
2000
Sample size (or number of clusters) by treatment arms
Control group = 1,000
Treatment group = 1,000
Arm 1 of Reflection Treatment = 300
Arm 2 of Reflection Treatment = 300
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
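The registration leaves this field blank. Purely as an illustration, a standard two-proportion formula applied to the planned 1,000-per-arm comparison gives a rough sense of the detectable effect; the baseline rate (50%), significance level (5%, two-sided), and power (80%) below are assumptions, not values stated in the registration.

```python
from math import sqrt
from statistics import NormalDist

def mde_two_proportions(n_per_arm, p=0.5, alpha=0.05, power=0.80):
    """Minimum detectable difference in proportions for two equal arms.

    p, alpha, and power are illustrative assumptions; the registration
    does not specify them.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    return (z_alpha + z_power) * sqrt(2 * p * (1 - p) / n_per_arm)

print(round(mde_two_proportions(1000), 3))  # ~0.063 (6.3 percentage points)
```

Under the same assumptions, the 300-per-arm reflection comparison would detect a difference of roughly 11 percentage points.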
IRB

Institutional Review Boards (IRBs)

IRB Name
Northeastern University Institutional Review Board
IRB Approval Date
2020-01-11
IRB Approval Number
IRB #: 20-01-10

Post-Trial

Post Trial Information

Study Withdrawal

Some information in this trial is unavailable to the public and may be requested from the Registry.


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials