Audit Study on the Returns to Remote Work

Last registered on September 02, 2024

Pre-Trial

Trial Information

General Information

Title
Audit Study on the Returns to Remote Work
RCT ID
AEARCTR-0012465
Initial registration date
March 24, 2024

The initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 02, 2024, 10:45 AM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
September 02, 2024, 1:54 PM EDT

Last updated is the most recent time when changes to the trial's registration were published.

Locations

There is information in this trial unavailable to the public.

Primary Investigator
Simon Quach
Affiliation
University of Southern California

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2024-01-01
End date
2026-01-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This study implements a randomized correspondence study to estimate the callback rate and wage differential for remote work by sex, race, education, experience, and geography. To identify causal estimates of the returns (positive or negative) to remote work, I create fictitious worker profiles on an online job board, where each worker's demographics, qualifications, and preference for remote work are randomly assigned. I plan to test whether job seekers who prefer working from home receive fewer interviews and are paid less on average. In addition, I am interested in how the returns to working from home vary by worker and firm characteristics, particularly whether remote work creates job opportunities for women and for workers in less populated areas.
External Link(s)

Registration Citation

Citation
Quach, Simon. 2024. "Audit Study on the Returns to Remote Work." AEA RCT Registry. September 02. https://doi.org/10.1257/rct.12465-2.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2024-04-01
Intervention End Date
2026-01-01

Primary Outcomes

Primary Outcomes (end points)
The key outcomes are 1) the total number of interview requests received by each candidate profile and 2) the salary offered in the callbacks.

I am also interested in the composition of firms that offer the interviews. Specifically, I will count the number of interviews from large vs. small firms, local vs. out-of-state firms, high-paying vs. low-paying firms, firms with remote-work programs vs. those without, FAANG vs. non-FAANG firms, and firms with a large vs. small share of female employees.

To reframe the difference in callback rates in dollar values, I also measure the amount by which remote candidates need to reduce their salary expectations in order to receive the same number of interviews as in-office candidates.
Primary Outcomes (explanation)
To measure the "wage differential" associated with remote work, I apply an Oaxaca-Blinder type decomposition. I jointly estimate the causal impacts of asking for a higher salary and asking for a remote job on the number of interview requests. The ratio of these two estimates will provide a measure of how much expected wages need to fall by for remote workers to receive the same number of interviews as in-office workers. Similar to an Oaxaca-Blinder decomposition, I will test the robustness of the estimation to allowing for differences in slopes between remote and non-remote workers.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Will be made public when the trial is complete.
Experimental Design Details
Not available
Randomization Method
All resume characteristics will be randomly assigned by computer (a sketch of the assignment procedure appears below).
Randomization Unit
Individual profile
Was the treatment clustered?
No
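A minimal sketch of this kind of computerized assignment (the specific characteristic lists and the numpy-based implementation are my own illustrative assumptions; the 40/40/20 remote/in-office/hybrid split matches the treatment-arm shares listed under Experiment Characteristics below):

    import numpy as np

    rng = np.random.default_rng(seed=2024)  # fixed seed so the draw is reproducible

    def draw_profile():
        """Randomly assign every characteristic of one fictitious worker profile."""
        return {
            "sex": rng.choice(["female", "male"]),
            "race": rng.choice(["white", "Black", "Hispanic", "Asian"]),
            "education": rng.choice(["bachelor's", "master's"]),
            "experience_years": int(rng.integers(2, 11)),
            "work_arrangement": rng.choice(
                ["remote", "in-office", "hybrid"], p=[0.4, 0.4, 0.2]
            ),
        }

    profiles = [draw_profile() for _ in range(10_000)]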

Experiment Characteristics

Sample size: planned number of clusters
~10,000 unique profiles, each active twice
Sample size: planned number of observations
~10,000 unique profiles
Sample size (or number of clusters) by treatment arms
~40% remote, 40% in-office, and 20% hybrid. Standard errors will be clustered by individual profile.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Assumptions:
1) The mean number of interviews per profile is 4.5, drawn from a previous report by the web platform.
2) From previous audit studies, the mean callback rate from sending a resume to an employer is 12.5%.
3) Inferring from previous audit studies, the standard deviation in callback rates is about 0.15.
4) (Number of interviews) = (Number of firms that view the profile) * (callback rate).
Substituting (1) and (2) into (4) implies 4.5 = 0.125 * (Number of firms), so each profile is viewed by about 36 firms. For the power calculation, I need the standard deviation in the number of interviews per profile: SD(Interviews) = SD(NumFirms * Callback) = 36 * SD(Callback) = 36 * 0.15 = 5.4. Given this mean and standard deviation, the minimum detectable effect on the number of interviews received is a difference of 0.353 interviews with a sample of 10,000 profiles.
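A minimal sketch of the implied calculation (the two-sided 5% test, 80% power, and two equal arms of roughly 4,000 profiles each are my assumptions; somewhat different test parameters would be needed to reproduce the registered 0.353 figure exactly):

    from scipy.stats import norm

    # Inputs taken from the assumptions above.
    mean_interviews = 4.5    # mean interviews per profile (platform report)
    callback_rate = 0.125    # mean callback rate from prior audit studies
    sd_callback = 0.15       # SD of callback rates from prior audit studies

    n_firms = mean_interviews / callback_rate   # ~36 firms view each profile
    sd_interviews = n_firms * sd_callback       # ~5.4 interviews per profile

    # Two-sample minimum detectable effect: two-sided 5% test, 80% power,
    # comparing two arms of ~4,000 profiles each (e.g., remote vs. in-office).
    alpha, power, n_per_arm = 0.05, 0.80, 4_000
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # about 2.80
    mde = z * sd_interviews * (2 / n_per_arm) ** 0.5
    print(f"SD(interviews) = {sd_interviews:.2f}, MDE = {mde:.3f} interviews")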
Supporting Documents and Materials

There is information in this trial unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
USC Institutional Review Board
IRB Approval Date
2023-09-18
IRB Approval Number
UP-23-00905