The Impact of Unpredictable Work Schedules on Hourly Workers’ Ability to Predict Their Short-Term Liquidity Demand: Evidence from an Earned Wage Access Fintech Experiment

Last registered on December 31, 2022

Pre-Trial

Trial Information

General Information

Title
The Impact of Unpredictable Work Schedules on Hourly Workers’ Ability to Predict Their Short-Term Liquidity Demand: Evidence from an Earned Wage Access Fintech Experiment
RCT ID
AEARCTR-0009362
Initial registration date
May 06, 2022

First published
May 09, 2022, 8:32 PM EDT

Last updated
December 31, 2022, 12:23 PM EST

Locations

Primary Investigator

Affiliation
University of California, Berkeley

Other Primary Investigator(s)

PI Affiliation
University of California, Berkeley
PI Affiliation
University of California, Berkeley

Additional Trial Information

Status
In development
Start date
2022-05-13
End date
2023-05-15
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We partner with an Earned Wage Access (EWA) fintech company to run a survey-based experiment through their app. EWA technology gives workers access to their wages as they are earned, rather than having to wait until payday. EWA serves as an alternative to high-interest forms of short-term credit (e.g., payday loans) and predominantly caters to lower-wage, hourly workers.

Our experiment seeks to answer three questions: (1) Can workers accurately predict how many hours they will work in the future? (2) Can workers accurately predict how much pay they will advance in the future? (3) How much of a role, if any, does misprediction of work hours play in misprediction of pay advances? Our survey collects workers’ beliefs about their future work hours and pay advances. Comparing these beliefs with workers’ actual hours and pay advances allows us to answer the first two questions. We develop and employ two treatments to answer the third.
External Link(s)

Registration Citation

Citation
Cefala, Luisa, Eric Koepcke and Nicholas Swanson. 2022. "The Impact of Unpredictable Work Schedules on Hourly Workers’ Ability to Predict Their Short-Term Liquidity Demand: Evidence from an Earned Wage Access Fintech Experiment." AEA RCT Registry. December 31. https://doi.org/10.1257/rct.9362-1.2
Sponsors & Partners

There is information in this trial unavailable to the public.

Experimental Details

Interventions

Intervention(s)
We run a survey-based experiment. An invitation to our survey is sent through the fintech company’s app.

Survey:
1. The survey collects workers’ beliefs about how many hours they'll work during their next pay period.
2. The survey collects workers’ beliefs about how much pay they'll have advanced to them during their next pay period.
3. In the vein of Allcott et al. (forthcoming), we introduce an incentive to advance less pay during the next pay period. We collect workers' beliefs about pay advances both without and with this incentive. We also use a multiple-price list (MPL) to measure workers’ valuation of this incentive.

The app’s administrative data records workers’ actual work hours and pay advances during their next pay period.
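
As an illustration of how a multiple-price list (MPL) can pin down a worker’s valuation of the incentive, the sketch below builds a generic increasing-price MPL and infers a valuation interval from the row at which a respondent switches to the fixed payment. The dollar amounts, names, and single-switch-point assumption are purely illustrative; they are not the values or code used in the survey.

# Hypothetical fixed payments offered as the alternative to the incentive (illustrative only).
PRICE_POINTS = [0.50, 1.00, 2.00, 4.00, 8.00]

def implied_valuation(choices, price_points=PRICE_POINTS):
    # choices: one entry per MPL row, "A" = prefers the incentive, "B" = prefers the fixed payment.
    # Assuming a single switch point, the first "B" brackets the valuation in dollars.
    for i, choice in enumerate(choices):
        if choice == "B":
            lower = price_points[i - 1] if i > 0 else 0.0
            return (lower, price_points[i])
    return (price_points[-1], float("inf"))  # never switched: valuation exceeds every offer

# Example: switching at the $4.00 row implies a valuation between $2.00 and $4.00.
print(implied_valuation(["A", "A", "A", "B", "B"]))
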
Intervention Start Date
2022-05-13
Intervention End Date
2022-06-15

Primary Outcomes

Primary Outcomes (end points)
1. Beliefs: For future work hours, we are interested in how the information treatment affects workers’ hours beliefs.

For future pay advances, we are interested in the effect the information treatment has on workers’ pay advance beliefs. For the fixed hours treatment, we are interested in how their pay advance beliefs differ under different work hours outcomes, as well as how they differ relative to the control and information treatment groups.

2. Misprediction: Using the administrative data, we can measure how accurate workers’ hours and pay advance beliefs are on average. This can be done for each treatment group.

In addition to misprediction within each treatment group, we are interested in the differences in misprediction between each treatment group and under different work hours outcomes for the fixed hours treatment.
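
As a rough illustration of these misprediction measures, the sketch below computes the mean signed error and mean absolute error of hours and pay advance beliefs by treatment group. The column and group names are hypothetical placeholders, not the actual variable names in our survey or the administrative data.

import pandas as pd

def misprediction_summary(df: pd.DataFrame) -> pd.DataFrame:
    # Signed error > 0 means the worker over-predicted; the absolute error measures accuracy.
    err = pd.DataFrame({
        "treatment": df["treatment"],  # e.g., "control", "information", "fixed_hours"
        "hours_error": df["believed_hours"] - df["actual_hours"],
        "advance_error": df["believed_advance"] - df["actual_advance"],
    })
    return err.groupby("treatment").agg(
        mean_hours_error=("hours_error", "mean"),
        mean_abs_hours_error=("hours_error", lambda s: s.abs().mean()),
        mean_advance_error=("advance_error", "mean"),
        mean_abs_advance_error=("advance_error", lambda s: s.abs().mean()),
    )
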
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
- 'Effect' of the incentive to advance less pay: Within each treatment group, we are interested in comparing workers’ pay advance beliefs without the incentive to those with the incentive, i.e., the perceived change in future advanced pay that the incentive would induce. We are also interested in how much workers value this incentive. We will look at these outcomes within each treatment group, as well as between treatment groups and under different work hours outcomes for the fixed hours treatment.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We have three treatment groups in our experiment:
1. The control group is given only the survey described above.
2. The information treatment group is provided with information before reporting their work hours and pay advance beliefs.
3. The fixed hours treatment group is asked to report their pay advance beliefs under different assumed work hours outcomes.
Experimental Design Details
Information treatment: This group is shown the average amount (across the app’s users) by which workers’ hours vary from month to month. Based on this, they are also told how much wages vary from month to month on average (due to hours variability). These two pieces of information are shown before they report their beliefs about future work hours and pay advances, respectively.

Fixed hours treatment: For this group, we take a worker’s self-reported minimum and maximum number of hours per week that they expect to work and split that range in half, i.e., into two hours bins. When collecting beliefs about future pay advances, we ask them to report beliefs under the assumption that their work hours next pay period fall within each of the two bins (so each worker reports two sets of beliefs about future pay advances).
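
A minimal sketch of this bin construction, assuming only the self-reported minimum and maximum weekly hours (variable and function names are ours for illustration):

def hours_bins(min_hours: float, max_hours: float):
    # Split the worker's self-reported hours range in half into two bins.
    midpoint = (min_hours + max_hours) / 2
    return (min_hours, midpoint), (midpoint, max_hours)

# Example: a worker reporting 20 to 36 hours per week states pay advance beliefs
# first assuming hours fall in (20, 28) and then assuming hours fall in (28, 36).
low_bin, high_bin = hours_bins(20, 36)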

Randomization: The computer randomizes each respondent into one of the three groups: 30% control, 30% information treatment, 40% fixed hours treatment.
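
A minimal sketch of this assignment step using the stated 30/30/40 probabilities (the arm labels and seed are illustrative; the production randomization is implemented by the survey platform, not by this code):

import random

ARMS = ["control", "information", "fixed_hours"]
PROBS = [0.30, 0.30, 0.40]

def assign_arm(rng: random.Random) -> str:
    # Draw one treatment arm with the pre-specified assignment probabilities.
    return rng.choices(ARMS, weights=PROBS, k=1)[0]

rng = random.Random(0)  # seed fixed only so the example is reproducible
assignments = [assign_arm(rng) for _ in range(1000)]
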
Randomization Method
Randomization done by computer.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
3,000-6,000 individuals.
Sample size: planned number of observations
3,000-6,000 individuals.
Sample size (or number of clusters) by treatment arms
1,000-2,000 individuals in the control group and each of the two treatment groups.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
University of California, Berkeley IRB
IRB Approval Date
2021-03-22
IRB Approval Number
2021-02-14026
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials