Increasing Access to Minnesota Child Care Stabilization Base Grants

Last registered on May 17, 2023

Pre-Trial

Trial Information

General Information

Title
Increasing Access to Minnesota Child Care Stabilization Base Grants
RCT ID
AEARCTR-0011435
Initial registration date
May 16, 2023

First published
May 17, 2023, 3:02 PM EDT

Locations

Region

Primary Investigator

Affiliation
Office of Evaluation Sciences, U.S. General Services Administration

Other Primary Investigator(s)

PI Affiliation
Office of Evaluation Sciences, U.S. General Services Administration
PI Affiliation
Office of Evaluation Sciences, U.S. General Services Administration
PI Affiliation
Office of Evaluation Sciences, U.S. General Services Administration
PI Affiliation
Office of Evaluation Sciences, U.S. General Services Administration
PI Affiliation
Minnesota Office of Management and Budget
PI Affiliation
Minnesota Department of Human Services

Additional Trial Information

Status
Ongoing
Start date
2023-02-10
End date
2024-09-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The pandemic highlighted the instability of the child care market and placed additional financial burdens on child care providers. The American Rescue Plan Act of 2021 (ARP) allocated approximately $24 billion for child care stabilization grants, which the Department of Health and Human Services' (HHS) Administration for Children and Families (ACF), working with states, territories, and tribes, provides as subgrants to child care providers. Minnesota's Department of Human Services (DHS) is administering several grant programs to help stabilize the child care industry in Minnesota, including the Child Care Stabilization Base Grant (CCSBG), which offers monthly grant awards to all eligible child care providers. As part of a portfolio of evaluations to learn what works to support equitable delivery of ARP programs, the Office of Evaluation Sciences (OES) at the U.S. General Services Administration collaborated with DHS on a randomized evaluation that seeks to understand the extent to which additional outreach strategies and modalities are effective at increasing take-up of stabilization grants among child care providers.
External Link(s)

Registration Citation

Citation
Duru, Maya et al. 2023. "Increasing Access to Minnesota Child Care Stabilization Base Grants." AEA RCT Registry. May 17. https://doi.org/10.1257/rct.11435-1.0
Experimental Details

Interventions

Intervention(s)
OES randomized child care providers to one of three groups: (1) business-as-usual outreach; (2) a communications bundle of behaviorally informed emails and text messages; or (3) the same communications bundle with an additional proactive phone call.
Intervention Start Date
2023-02-10
Intervention End Date
2023-07-01

Primary Outcomes

Primary Outcomes (end points)
A monthly dichotomous indicator equal to one if the provider was awarded CCSBG funding for that license ID in that month, and zero otherwise.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Whether the provider applied in a given month;
The monthly amount of funding received (imputed to 0 if the provider did not apply that month);
Whether the provider is eligible for funding in a given month;
Whether a site's license ID is not active in a given month (including 3 months after the end of the intervention for the LATE analysis); and
A dichotomous indicator for whether the center was active (i.e., a center opened with an active license) in a given month (including 3 months after the end of the intervention).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The study sample will consist of providers who applied five or fewer times in the six months prior to the beginning of the study. Providers who share a program address (e.g., two programs with the same address) or the same contact information (e.g., the same email address) will be clustered and considered to be the same provider.
OES randomized child care providers to one of three groups: (1) business-as-usual outreach; (2) a communications bundle of behaviorally informed emails and text messages; or (3) the same communications bundle with an additional proactive phone call.
Among the sample of provider clusters, the probability of assignment to the business-as-usual group was 50%, as was the probability of assignment to the communications group (comprising either the text-and-email bundle or the text, email, and phone call bundle). Assignment to the text-and-email group did not differ based on provider characteristics. Some providers in the text-and-email group were also randomly assigned to the text, email, and phone call group; this probability differed based on how often they had applied to CCSBG prior to the start of the evaluation. The overall probability of assignment to the phone call group over the five-month implementation period was:
43% for provider clusters that had applied zero times (N=674 provider clusters; 684 license IDs);
14% for provider clusters that had applied one to four times (N=132 provider clusters; 139 license IDs); and
17% for provider clusters that had applied five times (N=88 provider clusters; 89 license IDs).
Additionally, we randomly assigned the order in which phone calls were made: among provider clusters assigned to the phone call group, each provider was randomly assigned to a phone call batch of 50 phone numbers (i.e., Batches 1 through 18). A sketch of this assignment scheme appears below.
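To make the assignment scheme concrete, the short Python sketch below walks through the two-stage randomization and the phone-call batching. It is illustrative only: the study's randomization was done in Stata within blocks (described below), the column names (cluster_id, prior_apps) are hypothetical, and the conditional phone-call probabilities are an assumed translation of the reported overall rates.

import numpy as np
import pandas as pd

rng = np.random.default_rng(2023)

# Hypothetical frame: one row per provider cluster with its prior CCSBG
# application count category (the stratum shares here are made up).
clusters = pd.DataFrame({
    "cluster_id": range(3265),
    "prior_apps": rng.choice(["0", "1-4", "5"], size=3265, p=[0.5, 0.3, 0.2]),
})

# Stage 1: 50/50 split between business-as-usual and the communications bundle.
clusters["comms"] = rng.random(len(clusters)) < 0.5

# Stage 2: within the communications bundle, add the proactive phone call with a
# probability that depends on prior applications. These conditional rates are an
# assumption, chosen so that the overall rates roughly match the reported
# 43% / 14% / 17%.
phone_p = {"0": 0.86, "1-4": 0.28, "5": 0.34}
clusters["phone"] = clusters["comms"] & (
    rng.random(len(clusters)) < clusters["prior_apps"].map(phone_p).to_numpy()
)

# Randomly order the phone-call clusters and split them into batches of 50 numbers.
phone_clusters = clusters.loc[clusters["phone"]].sample(frac=1, random_state=1)
phone_clusters = phone_clusters.reset_index(drop=True)
phone_clusters["batch"] = phone_clusters.index // 50 + 1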
Experimental Design Details
Our randomization blocks were created from the combination of three variables:
A categorical measure for the number of times a program has applied for CCSBG in the six application windows between August 2022 and January 2023. This measure includes three (3) categories:
Applied 0 times;
Applied 1-4 times; and
Applied 5 times.
A categorical measure for provider type. This measure includes two categories:
Family child care center; and
Child care center or certified child care center.
A categorical measure for the variation of the opt-out text sent as part of an earlier evaluation:
Standard opt-out text variation; or
Transparent default opt-out text variation.

Thus, we randomize within 12 block combinations (3 application-count categories × 2 provider types × 2 versions of the opt-out text).
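As a rough illustration of randomizing within these 12 cells, the Python sketch below shuffles each block and assigns a fixed share to treatment; the block column names are hypothetical, and this is not the study's Stata code.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def assign_within_blocks(df: pd.DataFrame, p_treat: float = 0.5) -> pd.DataFrame:
    """Shuffle each block and mark the first p_treat share as treated."""
    block_cols = ["app_count_cat", "provider_type", "opt_out_variant"]  # hypothetical names
    assigned = []
    for _, block in df.groupby(block_cols):
        block = block.sample(frac=1, random_state=int(rng.integers(10**9)))
        n_treat = int(round(p_treat * len(block)))
        block["treated"] = [True] * n_treat + [False] * (len(block) - n_treat)
        assigned.append(block)
    return pd.concat(assigned)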
Randomization Method
Randomization was done in an office using the statistical software Stata.
Randomization Unit
We cluster providers according to whether they share the same contact information or the same physical address for a center. Outcomes will be collected at the license ID level, so that there is one observation for each license ID.
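For illustration, the sketch below groups license records into provider clusters whenever two records share an email address, phone number, or program address, using a small union-find; the field names are hypothetical rather than the project's actual matching code.

from collections import defaultdict

def cluster_providers(records):
    """records: dicts with 'license_id', 'email', 'phone', and 'address' keys (hypothetical)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Link each license ID to every identifier it shares with other records.
    for r in records:
        for key in ("email", "phone", "address"):
            if r.get(key):
                union(("lic", r["license_id"]), (key, r[key]))

    # Collect license IDs whose records ended up with the same root.
    clusters = defaultdict(list)
    for r in records:
        clusters[find(("lic", r["license_id"]))].append(r["license_id"])
    return list(clusters.values())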
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
3,265 provider clusters
Sample size: planned number of observations
3,374 license IDs and 5 months of outcome data, for a total of 16,870 expected observations.
Sample size (or number of clusters) by treatment arms
1,636 provider clusters (1,698 license IDs) assigned to business-as-usual
1,629 provider clusters (1,676 license IDs) assigned to the text and email group, including 894 provider clusters (912 license IDs) assigned to the phone call group
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
We conduct a simulation-based power analysis that accounts for the sample design, clustering, and analytic strategy described above. For each of 1,000 simulated treatment assignments that respect our sampling design and observed pre-treatment data, we simulate outcomes drawn from a range of possible effect sizes between one and five percentage points. Because we have multiple, potentially correlated effects of interest (i.e., the text-and-email bundle and the text, email, and phone call bundle), for ease of analysis we fix one effect and vary the other. Assuming a phone bundle effect of 4 percentage points, our simulation finds that we have 80 percent power to detect a minimum detectable effect of approximately 2 percentage points for the phone bundle.
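For intuition, a stripped-down version of such a simulation is sketched below; it assumes a 50% baseline award rate and a simple two-arm comparison, and it ignores the blocking and clustering that the actual analysis accounts for.

import numpy as np

rng = np.random.default_rng(42)

def simulated_power(n=3265, p_control=0.50, effect=0.02, sims=1000):
    """Share of simulated trials in which a two-proportion z-test rejects H0 at the 5% level."""
    rejections = 0
    for _ in range(sims):
        treat = rng.random(n) < 0.5                      # 50/50 assignment
        p = np.where(treat, p_control + effect, p_control)
        y = rng.random(n) < p                            # simulated binary outcome
        p1, p0 = y[treat].mean(), y[~treat].mean()
        pooled = y.mean()
        se = np.sqrt(pooled * (1 - pooled) * (1 / treat.sum() + 1 / (~treat).sum()))
        rejections += abs((p1 - p0) / se) > 1.96
    return rejections / sims

power_at_2pp = simulated_power(effect=0.02)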
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Maryland, College Park
IRB Approval Date
2023-01-17
IRB Approval Number
NA
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication


Is public data available?
No

Program Files

Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials