Documenting and Understanding the Invisible Load

Last registered on September 01, 2025

Pre-Trial

Trial Information

General Information

Title
Documenting and Understanding the Invisible Load
RCT ID
AEARCTR-0016624
Initial registration date
August 26, 2025

Initial registration date corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
September 01, 2025, 3:19 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
Tufts University

Other Primary Investigator(s)

PI Affiliation
Brigham Young University
PI Affiliation
Syracuse University

Additional Trial Information

Status
Ongoing
Start date
2025-08-26
End date
2027-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We study how childcare decision-makers respond to email inquiries from parents. In a US field experiment with daycare centers, we send an email that varies (i) whether the message frames the situation as an exogenous emergency (“we unexpectedly moved”) or as an endogenous oversight (“we dropped the ball”), and (ii) whether the sender is a mother or a father (signaled by a female- or male-sounding name). We measure whether and how centers respond (reply rate and speed), whether they offer an appointment or a spot (and timing), and the tone and helpfulness of replies. Some centers are randomly assigned to receive a neutral follow-up five days after the initial email if no reply has been received.
External Link(s)

Registration Citation

Citation
Buzard, Kristy, Laura Gee and Olga Stoddard. 2025. "Documenting and Understanding the Invisible Load." AEA RCT Registry. September 01. https://doi.org/10.1257/rct.16624-1.0
Experimental Details

Interventions

Intervention(s)
In the interest of preserving the integrity of the experiment, the intervention is described in full in the experimental design section, which will remain hidden until the experiment is completed.
Intervention Start Date
2025-09-01
Intervention End Date
2027-12-31

Primary Outcomes

Primary Outcomes (end points)
Reply received (0/1).
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
· ReplyLatency: Continuous measure of time to reply (in hours or days) from when the email was sent. This is the primary definition for latency and will be used in all main analyses.
· Tone / sentiment of response.
· Length of reply (word/character count).
· Content: Measures based on the content of the response (e.g., offer to schedule a tour/appointment, provision of alternative options or helpful resources, etc.).
Note on robustness: While the primary definition of ReplyLatency is continuous, for robustness and interpretability we may also categorize response time into bins (e.g., within 1 day, within 7 days, or before/after the 5-day encouragement follow-up).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In the interest of preserving the integrity of the experiment, the experimental design is described in full in the experimental design section, which will remain hidden until the experiment is completed.
Experimental Design Details
Not available
Randomization Method
Randomization done by computer. Stratification ensures balance across center characteristics such as provider type, region, and other attributes.
Randomization Unit
Daycare center.
Was the treatment clustered?
No
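
As one illustration, the following is a minimal sketch of how a computer-based, stratified assignment of daycare centers to the four arms (defined under the treatment arms below) could be implemented. The file name, column names, and use of Python/pandas are assumptions for illustration, not the study’s actual code.

```python
# Hypothetical sketch of stratified randomization of daycare centers to the
# four 2x2 arms (T1_M, T1_F, T2_M, T2_F); "centers.csv", "provider_type",
# and "region" are illustrative names, not the study's actual data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(20250826)  # fixed seed so the assignment is reproducible
ARMS = np.array(["T1_M", "T1_F", "T2_M", "T2_F"])

centers = pd.read_csv("centers.csv")  # one row per daycare center
centers["arm"] = None

# Within each stratum (provider type x region), repeat the four arms to cover
# the stratum and shuffle, so every stratum is split (nearly) evenly across arms.
for _, idx in centers.groupby(["provider_type", "region"]).groups.items():
    arms = np.resize(ARMS, len(idx))
    rng.shuffle(arms)
    centers.loc[idx, "arm"] = arms

print(centers["arm"].value_counts())  # check that arm sizes are balanced
```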

Experiment Characteristics

Sample size: planned number of clusters
We plan to include approximately 2,000 daycare centers in the pilot study. Each daycare center represents one cluster, as responses are measured at the center level. Following the pilot, we will update our power calculations based on the observed effect sizes.
Sample size: planned number of observations
Because each daycare center is contacted only once (with a follow-up in the encouragement arm if no response), the total number of observations for the primary outcomes in the pilot will also be 2,000 daycare centers. Each observation corresponds to one unique center’s response (or non-response) to our email. The number of observations for the secondary outcomes (e.g., sentiment of the email response) will depend on the response rate across the treatments.
Sample size (or number of clusters) by treatment arms
We are running a 2×2 factorial design with four equally sized treatment arms:
T1_M: “Unexpected move” message, male sender
T1_F: “Unexpected move” message, female sender
T2_M: “Dropped the ball” message, male sender
T2_F: “Dropped the ball” message, female sender
We plan to have 2,000 centers in the pilot.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Binary outcomes (reply yes/no): With n=500 centers in each treatment arm (four arms total, N=2,000), a two-sided α=0.05, and 80% power, the Minimum Detectable Effect (MDE) for arm-to-arm comparisons is approximately 7–9 percentage points, depending on the true baseline reply rate. For example, if the baseline reply rate is 20%, the MDE is ~7.1pp; if 30%, ~8.1pp; if 40%, ~8.7pp; if 50%, ~8.9pp. These minimum detectable effects are appropriate for a pilot; the full-scale experiment will be powered to detect smaller effects once expanded.
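
The following sketch reproduces these figures, assuming the MDEs were computed with the standard normal-approximation power formula for a two-sample comparison of proportions; the formula choice is an assumption, as the registration does not state the calculation used.

```python
# Normal-approximation MDE for comparing reply rates between two arms,
# with n = 500 centers per arm, two-sided alpha = 0.05, and 80% power.
from scipy.stats import norm

def mde_two_proportions(p0, n_per_arm, alpha=0.05, power=0.80):
    """Smallest detectable difference in reply rates between two arms,
    approximating both arms' variance at the baseline rate p0."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * (2 * p0 * (1 - p0) / n_per_arm) ** 0.5

for p0 in (0.20, 0.30, 0.40, 0.50):
    print(f"baseline {p0:.0%}: MDE ~ {mde_two_proportions(p0, 500):.1%}")
# baseline 20%: MDE ~ 7.1%;  30%: ~8.1%;  40%: ~8.7%;  50%: ~8.9%
```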
IRB

Institutional Review Boards (IRBs)

IRB Name
Tufts SBER IRB
IRB Approval Date
2021-04-21
IRB Approval Number
STUDY00001527
Analysis Plan

There is information in this trial unavailable to the public.