The Labor Market Outcomes of Adult Learners with Online Bachelor's Degrees: An Audit Correspondence Study

Last registered on December 21, 2023

Pre-Trial

Trial Information

General Information

Title
The Labor Market Outcomes of Adult Learners with Online Bachelor's Degrees: An Audit Correspondence Study
RCT ID
AEARCTR-0012692
Initial registration date
December 17, 2023


First published
December 21, 2023, 7:55 AM EST


Locations

Region

Primary Investigator

Affiliation
Pardee RAND Graduate School

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2023-12-19
End date
2024-03-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Many workers pursue higher education online to advance their careers. However, employers favor job candidates with in-person degrees in hypothetical matchups (Kaupins et al., 2014; Kohlmeyer et al., 2011; Roberto & Johnson, 2019; Tabatabaei et al., 2014). In open-ended questioning, employers express concerns about online degree holders' soft skills and coursework rigor (Kohlmeyer et al., 2011; Roberto & Johnson, 2019).

These studies rely on employers' self-reported preferences rather than observing their true hiring practices, potentially introducing social desirability bias. Audit correspondence field experiments examine hiring bias in real-life scenarios by generating fictitious resumes and submitting them to actual job openings. Audit studies also eliminate confounders by creating an artificial labor pool with no average differences between groups and randomizing key characteristics (Neumark, 2018).

Two previous audit studies by Deming et al. (2016) and Lennon (2021) examined employer perceptions of online bachelor's degrees using fictitious resumes of young college graduates in their twenties. They found that job candidates with online business degrees received about one-fourth (Deming et al., 2016) to one-half (Lennon, 2021) fewer callbacks than in-person graduates. These findings have limited generalizability because the research designs excluded the modal online student—adult learners in their thirties (Friedman, 2017)—whose long work histories may assure employers of their productivity.

This study aims to determine if Deming et al. and Lennon's results hold for adult learners. Using an audit correspondence field experiment, I test if adult learners with bachelor's degrees from online colleges are less likely to receive a job callback than those with bachelor's degrees from brick-and-mortar colleges. I construct fictitious resumes of adult learners using publicly available resumes from a large job search website and a resume characteristic randomizer (Lahey & Beasley, 2009). I draw from prior literature to convey gender and race (through applicant name – see Gaddis, 2017) and age (through high school graduation date – see Neumark, Burn, and Button, 2019). To depict adult learners' professional experience, I fill the time between high school graduation and college enrollment with ten years of work experience and add four additional years of work experience during college enrollment. Finally, I randomly assign the college type—online or brick-and-mortar—before submitting applications to job openings in business administration.
External Link(s)

Registration Citation

Citation
Greer, Lucas. 2023. "The Labor Market Outcomes of Adult Learners with Online Bachelor's Degrees: An Audit Correspondence Study." AEA RCT Registry. December 21. https://doi.org/10.1257/rct.12692-1.0
Experimental Details

Interventions

Intervention(s)
The experimental variation requires that each matched pair of fictitious applicants includes one bachelor's degree holder from an online college and one bachelor's degree holder from a brick-and-mortar college.
Intervention Start Date
2023-12-19
Intervention End Date
2024-03-01

Primary Outcomes

Primary Outcomes (end points)
Job callbacks from employers
Primary Outcomes (explanation)
The outcome measure is job callbacks via phone, email, or message through the job searching website. I will track all callbacks and record them as positive (an explicit interview offer), ambiguous (a request for more information or to discuss the position further), or negative (a rejection or no response within four weeks of the application). A callback is defined as any positive or ambiguous response. As a robustness check, I will redefine callbacks as a positive response only (explicit request for an interview).
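The coding rule above can be sketched in a few lines. This is a minimal illustration of the stated definitions; the response labels ("interview_offer", "info_request", and so on) are invented for the example, not categories from the registration.

```python
# A sketch of the callback-coding rule described above. The response-type
# labels are illustrative placeholders, not the study's actual codes.
def code_response(response_type):
    """Map an employer response to positive, ambiguous, or negative."""
    positive = {"interview_offer"}                    # explicit interview request
    ambiguous = {"info_request", "discuss_position"}  # wants more info or to talk
    if response_type in positive:
        return "positive"
    if response_type in ambiguous:
        return "ambiguous"
    return "negative"  # rejection, or no response within four weeks

def is_callback(response_type, strict=False):
    """Main definition: positive or ambiguous counts as a callback.
    Robustness check (strict=True): only a positive response counts."""
    category = code_response(response_type)
    return category == "positive" if strict else category in ("positive", "ambiguous")
```

The `strict` flag implements the registered robustness check, which redefines a callback as an explicit interview request only.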

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The audit correspondence study will use a matched design: I send two applications to each job vacancy, in random order, one resume and cover letter from an online degree holder and one from an in-person degree holder. I apply to vacancies in business administration across four occupation types: accounting, finance, management, and marketing. Vacancies are also spread across eight large metropolitan areas that vary by geographic region (Northeast, South, West, and Midwest) and bachelor’s degree attainment. I find vacancies on a national hiring website by searching for jobs that require a bachelor’s degree in accounting, finance, management, marketing, or a similar field, and then select a subset of vacancies appropriate for recent graduates.

Fictitious resumes will vary by college type, race, gender, work history, and skills. To ground my fictitious resumes in empirical evidence, I extract information from publicly available resumes on a job-searching website and reconstruct new resumes via a computer program that randomly selects resume characteristics (Lahey and Beasley, 2009). Resumes are anonymized (i.e., altered dates, employer names, and work descriptions) to ensure that fictitious work histories contain an assortment of jobs from real resumes but are not identifiable. Fictitious resumes are randomly assigned one of two templates. Both templates have the same sections, including name, contact information, education, work history, and skills, but differ slightly by format, such as the font or bullet types, to avoid being suspiciously similar.
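The within-pair assignment can be sketched as follows. This is an illustrative sketch, not the study's actual program: the field names, template labels, and seed are assumptions, and the real randomizer (Lahey and Beasley, 2009) assigns many more characteristics.

```python
import random

# Illustrative matched-pair assignment: for each vacancy, one applicant is
# assigned the online college and one the brick-and-mortar college, each
# applicant gets a distinct resume template, and submission order is random.
# All names here are placeholders, not the registered implementation.
def build_pair(vacancy_id, rng):
    colleges = ["online", "brick-and-mortar"]
    templates = ["template_a", "template_b"]
    rng.shuffle(colleges)   # which applicant gets which college type
    rng.shuffle(templates)  # each applicant gets a different template
    pair = [
        {"vacancy": vacancy_id, "college_type": c, "resume_template": t}
        for c, t in zip(colleges, templates)
    ]
    rng.shuffle(pair)       # randomize the order of submission
    return pair

rng = random.Random(12692)  # arbitrary seed for reproducibility
pair = build_pair("vacancy-001", rng)
```

Seeding the generator makes each assignment reproducible for auditing, while still balancing college type and template within every matched pair by construction.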

The education section lists one bachelor’s degree in business administration from a four-year public or nonprofit school. While all degrees are in a business field, the degree’s specialization aligns with the occupation type (e.g., applicants for finance positions will have a bachelor’s in finance or a bachelor’s in business administration with a concentration in finance). The exact names of the degrees differ slightly to represent the name of the program at the listed college but are generally equivalent. All profiles complete their bachelor’s degrees after four years of schooling, and none include a GPA.

The education section also lists high school education to convey age (see Neumark, Burn, & Button, 2019). Adult learners are typically defined as age 25 and older, but that cutoff does not describe the typical online adult learner: the average age of an online student is 32 years old (Friedman, 2017). Assuming students graduated high school at 18 years old, I use 2009 as the average high school graduation year. To minimize the risk of detection from employers receiving two resumes with the same high school graduation year, I randomly vary one resume by plus or minus one year. Like Deming et al. (2016), I use the Common Core of Data to randomly assign public high schools from a combined statistical area to fictitious candidates with the same race as the median student. I also control for socioeconomic status by filtering out high schools in the highest and lowest quartile of free- or reduced-price lunch. Finally, I verify the selected schools have operated since 2000 and do not have other notable academic standing (e.g., selective enrollment, nationally recognized, unique pedagogies).

The experimental variation requires that each matched pair of resumes includes one online and one brick-and-mortar college. As in Deming et al. (2016), I define online colleges as those where online degrees predominate, regardless of the existence of some in-person offerings. The degree modality is not listed but is conveyed by the name recognition of predominately online colleges. To maximize the signal’s salience, I use online colleges that awarded high numbers of bachelor’s degrees in business and have operated online for at least five years.

The control resume in each pair lists a bachelor's degree from a brick-and-mortar institution near the regional labor market with little to no online offerings. Each labor market has one or two regional brick-and-mortar colleges to randomly draw from. I used fall 2019 data from the Integrated Postsecondary Education Data System (IPEDS) and the institutions' public websites to identify relevant regional institutions for each labor market. As online learning has become commonplace in postsecondary education, some of the selected brick-and-mortar institutions offer online business courses or a fully online master's degree program in business administration but do not have fully online bachelor's degree programs in business administration. In all cases, the selected brick-and-mortar institutions have very few undergraduate students enrolled exclusively online. Thus, employers are unlikely to perceive degrees from these institutions as being online. To avoid biasing results by college selectivity, the brick-and-mortar institutions are among the least selective in the region by acceptance rate.

I convey applicants’ race (white or Black) and gender (male or female) with their name. Names that convey race may also incidentally signal socioeconomic status (SES). To address this concern, I draw first names from Gaddis’s (2017) survey on the racial and SES perceptions of names from New York state birth records spanning 1994 to 2012. I select a subset of first names that are commonly perceived as white or Black (matching the race signal for at least 90 percent of survey respondents) and remove names from the first and fourth quartiles of mother’s education level (a proxy for strong SES signals). I also extract last names from Gaddis’s work, which are common surnames from 2012 Census Data with a population-level racial occurrence of at least 50% white or Black. I select four first names for each gender and race combination and four last names per racial group. Rather than using all sixty-four combinations of names, I take Bennett’s (2022) approach and select four full names for each gender and race combination.
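The name-screening rules above amount to two filters, sketched here on invented records. The entries below are placeholders, not actual names or data from Gaddis (2017); the thresholds mirror the text (race-signal congruence of at least 90 percent, and a mother's-education quartile in the middle two quartiles).

```python
# Hypothetical illustration of the name-screening rules. The records are
# invented placeholders, not data from Gaddis (2017).
candidate_names = [
    {"name": "NameA", "congruence": 0.96, "ses_quartile": 2},
    {"name": "NameB", "congruence": 0.88, "ses_quartile": 3},  # fails congruence
    {"name": "NameC", "congruence": 0.93, "ses_quartile": 1},  # fails SES filter
    {"name": "NameD", "congruence": 0.91, "ses_quartile": 3},
]

selected = [
    n["name"] for n in candidate_names
    if n["congruence"] >= 0.90       # >= 90% of respondents match the race signal
    and n["ses_quartile"] in (2, 3)  # drop first and fourth quartiles of
                                     # mother's education (strong SES signals)
]
```

Applying both filters keeps only names that signal race clearly without also carrying a strong socioeconomic signal.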

Realistic resumes must also include work histories. I will fill the time between high school graduation (2009) and college enrollment (2019) with ten years of work experience. Since it is common for online students to work during their studies (Friedman, 2017), I will include an additional four years of work experience throughout college enrollment. The applicants’ long work histories are what set this audit apart from similar experiments about online degrees. To construct work histories, I extract job titles and descriptions from a sample of publicly available Indeed resumes. I screen for high-quality resumes (e.g., complete descriptions of job responsibilities) of online degree holders with prior work experience in each occupation type. The job descriptions are rephrased, anonymized (e.g., changing the employer’s name), and deconstructed into separate job templates. To ensure work histories resemble those of adult learners on Indeed, the first work experience following high school will not directly align with the occupation type (e.g., cashier). I create four complete work history templates for each of the four occupation types. Work history templates are tailored to each metropolitan area by substituting the employers’ names with local employers or chain businesses that operate in the region.

For each occupation type, I generate relevant skill templates (technical and soft) with the same approach. The addition of skills provides a useful way to address Heckman’s critique of experimental labor discrimination studies. Heckman argues that differences in the variance of unobservable characteristics related to productivity can generate bias (Heckman, 1998). However, there is no evidence that online students have a higher variance of unobservable characteristics than in-person students. For the sake of robustness, I address the Heckman critique by following the approach used in Neumark, Burn, and Button (2019) and randomly assign additional skills to convey variations in human capital investment. I control for high-skill resumes in the regression analysis.

The final resume elements to include are applicants’ addresses and contact information. Applicants live in the same metropolitan area as the vacancy to which they are applying. I select a subset of the fictitious addresses used in Neumark et al. (2019). To give employers several ways to call back, I create Gmail accounts for each name combination and phone numbers for each metropolitan area and college type combination. Employers can also contact applicants through their Indeed profile.

Job vacancies that require a bachelor’s degree typically ask for, or at least allow, the submission of a cover letter. With each resume, I send a cover letter randomly assigned as one of two templates. The second fictitious applicant in the matched pair uses the other template. Like resumes, the cover letter templates slightly differ in formatting and phrasing but concisely address the same points, including expressing interest in the vacancy, a description of the candidate’s professional experience, and an explanation of the technical and soft skills the candidate would bring to the role. Time constraints prevent me from tailoring cover letters to individual job postings; however, they are tailored by occupation type to emphasize relevant professional experience and skills (e.g., accounting candidates have strong organizational skills and experience managing financial records).

Full data collection will begin in December 2023.
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
Employers
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
1,098 employers
Sample size: planned number of observations
2,196 resume and cover letter submissions
Sample size (or number of clusters) by treatment arms
1,098 resumes and cover letter submissions from fictitious candidates with bachelor's degrees from online colleges.
1,098 resumes and cover letter submissions from fictitious candidates with bachelor's degrees from brick-and-mortar colleges.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
During the first two weeks of October 2023, I conducted a pilot study to complete power calculations and refine logistical details. I sent 392 resumes to 196 vacancies. Online candidates received 31 callbacks (15.8%) and in-person candidates received 29 (14.8%). This difference was not statistically significant using Fisher's exact two-tailed test. Using the statistical software G*Power (Faul et al., 2007), I conduct a power analysis to estimate the smallest number of vacancies needed for the experiment. The software requires inputs for significance level, statistical power, and estimated callback rates. I use conventional standards for significance (α = 0.05) and power (1 − β = 0.80) and pilot study estimates for callback rates (14.8% for in-person candidates). I also use an intra-cluster correlation coefficient of 0.2. I will need to apply to 1,098 vacancies to detect a 30 percent difference (4.44 percentage points) using Fisher's exact two-tailed test. A higher-powered study would be able to detect smaller differences; however, this study’s intent is to provide practical insights for the online postsecondary education sector, not to detect the smallest difference possible. Small callback differences between online and in-person candidates are unlikely to influence adult learners’ college choices when measured against online education’s attractive features, such as flexibility and affordability. Therefore, I power the study to detect substantive differences between groups, and a smaller sample size can accomplish that. The target sample size from this power analysis serves as a minimum. If data collection is going smoothly, and I have the resources to apply to additional vacancies, I will continue data collection and analyze all of the data I am able to collect. As a stretch goal, I will attempt to apply to a maximum of 1,608 vacancies to detect a 25 percent difference (3.7 percentage points).
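The sample-size target can be roughly reproduced with the standard two-proportion formula. This is a sketch under stated assumptions, not the registered G*Power calculation: it uses the normal approximation rather than Fisher's exact test, so its answer differs slightly from the registered 1,098 vacancies, and the clustering adjustment shown is the textbook design-effect formula for two observations per cluster.

```python
import math

# Pilot callback rate for in-person candidates, and the 30% relative
# difference the study is powered to detect (4.44 percentage points).
p1 = 0.148        # in-person callback rate (pilot estimate)
p2 = p1 * 1.30    # online rate under the alternative hypothesis

z_alpha = 1.959964  # two-sided alpha = 0.05
z_beta = 0.841621   # power = 0.80

# Two-proportion sample size per group, normal approximation.
pbar = (p1 + p2) / 2
n = ((z_alpha * math.sqrt(2 * pbar * (1 - pbar))
      + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
     / (p1 - p2) ** 2)
n_vacancies = math.ceil(n)  # one pair of resumes per vacancy

# Textbook clustering adjustment: design effect = 1 + (m - 1) * ICC,
# with m = 2 resumes per vacancy and ICC = 0.2.
deff = 1 + (2 - 1) * 0.2
```

The approximation lands in the neighborhood of the registered figure; the exact-test routine in G*Power, together with its handling of the cluster correction, accounts for the remaining gap.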
IRB

Institutional Review Boards (IRBs)

IRB Name
RAND Corporation
IRB Approval Date
2023-08-22
IRB Approval Number
2023-N0079

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials