From Degrees to Skills: How Job Requirements Shape Applicant Composition and Fit

Last registered on November 17, 2025

Pre-Trial

Trial Information

General Information

Title
From Degrees to Skills: How Job Requirements Shape Applicant Composition and Fit
RCT ID
AEARCTR-0016070
Initial registration date
August 21, 2025

First published
August 22, 2025, 6:12 AM EDT

Last updated
November 17, 2025, 1:05 PM EST

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
Georgetown University

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2025-11-18
End date
2026-03-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This paper examines how skill-based hiring practices influence the composition of the applicant pool, focusing on candidates’ educational backgrounds, skill profiles, race, and gender. Evidence shows that degree requirements often exclude qualified candidates and reinforce inequality, prompting many employers to adopt skill-based practices such as removing degree requirements to signal that skills, rather than credentials, will drive hiring decisions. Yet little is known about how such requirements affect job seekers’ application behavior. To address this gap, I partnered with a firm hiring for entry-level positions in software engineering and marketing and randomized the educational qualifications stated in job postings across three conditions: (1) a bachelor’s degree explicitly required, (2) no degree requirement stated, and (3) an explicit commitment to skill-based hiring, emphasizing alternative pathways and skill assessments. Using unique identifiers in emails and job board postings, I track candidate engagement (clicks, visits, and applications) and application outcomes. The primary outcomes are (i) the diversity of the applicant pool by education, race, and gender; (ii) the degree of skill match between applicants and job requirements; and (iii) organizational search costs. This design provides evidence on whether removing degree requirements and emphasizing skill assessments expands access for non-degree holders and underrepresented groups, while also assessing potential trade-offs in applicant quality and recruitment efficiency.
External Link(s)

Registration Citation

Citation
Oseguera, Mariana. 2025. "From Degrees to Skills: How Job Requirements Shape Applicant Composition and Fit." AEA RCT Registry. November 17. https://doi.org/10.1257/rct.16070-2.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
The intervention consists of randomizing the educational qualifications stated in job postings for entry-level software engineering and marketing positions. Each applicant is exposed to one of three versions of the posting:

Degree Required: The job description explicitly requires a bachelor’s degree.

Degree Silent: The job description does not mention educational requirements.

Skill-Based Hiring: The job description explicitly states that the company welcomes candidates who have developed skills through diverse pathways and/or that hiring decisions will be based on skill assessments.

Candidates are recruited through two channels: (a) direct outreach via email campaigns using candidate lists from ZipRecruiter, and (b) organic job board traffic to the firm’s career site. Randomization occurs at the individual candidate level through unique identifiers embedded in email links and randomized landing pages. Once assigned, candidates consistently view the same condition.
Intervention Start Date
2025-11-18
Intervention End Date
2026-03-31

Primary Outcomes

Primary Outcomes (end points)
Application behavior

Whether a candidate clicks on the job posting link (click-through rate).

Whether a candidate submits a job application (application rate).

Applicant composition

Educational background: share of applicants with and without a college degree.

Demographics: share of applicants by gender and race/ethnicity.

Applicant quality (skill match)

Degree to which applicants’ résumés and experience align with the job’s required skills, assessed using (a) structured evaluations by HR managers (blinded to treatment) and (b) automated résumé parsing/assessment tools.

Recruitment efficiency (search costs)

Number of applications received per posting.

Time/resources required for screening applicants, proxied by the distribution of low-match vs. high-match applicants.
Primary Outcomes (explanation)
Application behavior

Click-through rate: binary indicator equal to 1 if a candidate clicks on the unique job link in the outreach email or job board posting.

Application rate: binary indicator equal to 1 if a candidate submits a complete job application through the company’s careers page.

Applicant composition

Educational background: measured using self-reported education on the job application form. For non-applicants in the controlled ZipRecruiter sample, educational background will be supplemented with information available from ZipRecruiter.

Demographics: measured using self-reported gender and race/ethnicity on the application form. For non-applicants in the controlled sample, gender and race/ethnicity will be inferred algorithmically (using Gender/Race APIs based on names).

Applicant quality (skill match)

HR rubric evaluation: As part of its regular hiring process, the organization’s HR managers will review applications using a structured evaluation rubric adapted from Nichols et al. (2023). The rubric assesses candidates on relevant experience and interpersonal/leadership potential, using a 1–5 scale.

HR surveys and interviews: HR managers will also complete short surveys and participate in interviews to discuss how they apply the rubric and assess candidate qualifications. These materials provide additional insight into how evaluators interpret résumés, though they will not influence scoring. The order of information modules (e.g., education, work experience, skills) will be randomized, ensuring that no single résumé attribute is always presented first and allowing a test of whether sequencing influences evaluators’ perceptions of candidate fit. Each profile review will be recorded, and managers’ comments and reactions will be captured through the surveys and interviews. These qualitative data are exploratory and intended to provide context for how résumé information is interpreted across treatment conditions.

AI-powered assessment tool: The organization’s existing automated résumé screening system will also be used in the same way it is ordinarily deployed for initial candidate review. The scores generated by this tool will be recorded for research purposes.

Recruitment efficiency (search costs)

Constructed from the distribution of rubric and AI-assessed applicant quality, capturing how many low-match versus high-match applicants each posting attracts. This provides a proxy for the organization’s screening costs.
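
As an illustration only, a minimal sketch (in Python, with hypothetical column and arm names) of how the per-arm outcome rates and this screening-cost proxy could be tabulated from candidate-level data; the rubric cutoff of 3 is an assumption for the example, not a registered threshold:

    import pandas as pd

    # Hypothetical candidate-level data: one row per randomized candidate;
    # "rubric" is the 1-5 HR score, observed only for applicants.
    df = pd.DataFrame({
        "arm":     ["degree_required", "degree_silent", "skill_based", "skill_based"],
        "clicked": [1, 0, 1, 1],
        "applied": [0, 0, 1, 1],
        "rubric":  [float("nan"), float("nan"), 4, 2],
    })

    # Click-through and application rates by treatment arm.
    print(df.groupby("arm")[["clicked", "applied"]].mean())

    # Screening-cost proxy: share of low-match applicants per arm
    # (illustrative cutoff: rubric score below 3).
    applicants = df[df["applied"] == 1]
    print((applicants["rubric"] < 3).groupby(applicants["arm"]).mean())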

Note: The organization retains full discretion over final hiring decisions and any subsequent recruitment stages. These outcomes are not part of the study and will not be shared with the PI. All candidates are considered equally by the company, regardless of which job ad version they viewed.

Secondary Outcomes

Secondary Outcomes (end points)
Job search engagement beyond applications

Visits without application: measured as the number of unique visitors who clicked on a job posting but did not complete an application. This captures candidates who were initially interested but decided not to apply after viewing the posting.

Time on page: for candidates who clicked but did not apply, session data (e.g., time spent on the job posting page) will be recorded as an additional indicator of engagement.

Heterogeneity by occupation and recruitment channel

Occupation-specific effects: outcomes will be analyzed separately for software engineering and marketing roles, given differences in occupational cultures and hiring practices.

Recruitment channel: outcomes will also be compared between (a) candidates reached through ZipRecruiter outreach and (b) candidates arriving via organic job board traffic.

Demographic differences in engagement

Click vs. application by subgroup: differences between the proportion of candidates who click and the proportion who apply, broken down by gender, race/ethnicity, and educational background. This will help identify whether some groups are more likely to self-select out at different stages of the process.

Perceived candidate fit (exploratory)

HR surveys and interviews will be recorded and analyzed qualitatively to capture how managers interpret different résumé characteristics across conditions.

The order of candidate profile information modules (e.g., education, work experience, skills) will be randomized to test whether sequencing influences perceptions of candidate fit (a brief sketch of this randomization appears after this list).

These insights will provide context for understanding patterns in applicant composition and quality.
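
As a minimal sketch of this module-order randomization (the module labels and seeding scheme are illustrative assumptions; the registration does not specify the implementation):

    import random

    MODULES = ["education", "work_experience", "skills"]

    def randomized_module_order(profile_seed):
        # Return a reproducible random presentation order of résumé modules
        # for one candidate profile, so the order can be linked to evaluations.
        rng = random.Random(profile_seed)
        order = MODULES.copy()
        rng.shuffle(order)
        return order

    print(randomized_module_order(42))  # e.g., ['skills', 'education', 'work_experience']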
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This study examines how job posting language influences who applies for open positions. The partnering organization will vary the way educational qualifications are described across three versions of the same job: (1) a posting that requires a bachelor’s degree, (2) a posting that omits any mention of a degree requirement, and (3) a posting that explicitly welcomes candidates with diverse pathways to skills and/or notes that skill assessments may be part of the process.

Job openings will be advertised as part of the company’s regular hiring process in four occupations: software engineering, sales, office administration, and digital marketing. Applicants will encounter one version of the job posting at random, either through direct outreach or through the company’s career site. Application data will then be used to evaluate how the different postings affect the diversity, skills, and backgrounds of the applicant pool.
Experimental Design Details
Not available
Randomization Method
Randomization is done by a computer program at the individual candidate level. When candidates click on an outreach email link or visit the company’s careers page through a job board, they are automatically assigned by the system to one of the experimental conditions. Assignment is persistent through first-party cookies/session storage, ensuring candidates always see the same version if they return. No manual procedures, coin flips, or third-party cookies are used.
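
As one illustration of deterministic, persistent assignment (the registration does not disclose the exact mechanism; the hash-based mapping and identifier format below are assumptions):

    import hashlib

    ARMS = ["degree_required", "degree_silent", "skill_based"]

    def assign_condition(candidate_id):
        # Map a candidate's unique identifier (e.g., the token embedded in an
        # outreach email link) to one of the three conditions. The mapping is
        # deterministic, so the same identifier always yields the same arm.
        digest = hashlib.sha256(candidate_id.encode("utf-8")).hexdigest()
        return ARMS[int(digest, 16) % len(ARMS)]

    print(assign_condition("email-link-0001"))  # stable across repeat visits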
Randomization Unit
The unit of randomization is the individual job seeker. Each candidate is independently assigned to a treatment condition. Randomization occurs across all recruitment channels (ZipRecruiter outreach sample and organic job board sample).
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
0 (not clustered). Randomization is at the individual job-seeker level; there are no clusters (e.g., schools or firms), and the unit of analysis is the individual candidate.
Sample size: planned number of observations
Approximately 10,000 individual candidates contacted via email outreach (depending on the number of qualified candidates in the pool), plus additional organic job-board visitors; the final total N will therefore exceed 10,000. I will report the realized organic N and per-arm counts at the end of fielding.
Sample size (or number of clusters) by treatment arms
Planned sample (N = 10,000; equal allocation, one third per arm; realized counts may vary with application rates):

Degree required — ~3,300

Degree silent — ~3,300

Skill‑based (welcoming + assessment) — ~3,300
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Power calculation: minimum detectable effect size (MDE) for main outcomes.

Unit: individual candidate (first exposure).
Clustering: none (randomization at the individual level).
Outcome: application rate (binary).
Baseline & SD: baseline application rate of 10%, with SD = √(0.10 · 0.90) = 0.300; benchmarked to Hurst, Lee, & Frake (2024, Strategic Management Journal).
Tests: two-sided, 80% power. MDEs are shown for α = 0.05, with Bonferroni-adjusted values (α = 0.0125) in parentheses.
Conservatism: estimates are conservative because they use only the email outreach pool (N = 10,000; 2,000 per arm) and exclude additional randomized organic traffic; as total N increases, MDEs decline at a rate of approximately 1/√N.

MDEs (absolute percentage points):
2,000 vs. 2,000 (single arm vs. single arm): 2.66 pp (3.20 pp Bonferroni).
2,000 vs. 6,000 (degree-required vs. pooled skill-based arms): 2.17 pp (2.60 pp Bonferroni).

I will recompute MDEs ex post using realized baselines and final sample sizes (including organic traffic).
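
As a minimal sketch of the MDE arithmetic above, assuming the standard two-sample difference-in-proportions formula (the registration does not name the software used, and the Bonferroni figures below differ from the registered values by small rounding amounts):

    from scipy.stats import norm

    def mde(n1, n2, sd=0.300, alpha=0.05, power=0.80):
        # Minimum detectable effect for a two-sided, two-sample comparison:
        # MDE = (z_{1-alpha/2} + z_{power}) * sd * sqrt(1/n1 + 1/n2)
        z_alpha = norm.ppf(1 - alpha / 2)
        z_power = norm.ppf(power)
        return (z_alpha + z_power) * sd * (1 / n1 + 1 / n2) ** 0.5

    print(round(100 * mde(2000, 2000), 2))                # 2.66 pp
    print(round(100 * mde(2000, 6000), 2))                # 2.17 pp
    print(round(100 * mde(2000, 2000, alpha=0.0125), 2))  # ~3.17 pp (registered: 3.20)
    print(round(100 * mde(2000, 6000, alpha=0.0125), 2))  # ~2.59 pp (registered: 2.60)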
IRB

Institutional Review Boards (IRBs)

IRB Name
Georgetown University IRB
IRB Approval Date
2025-07-29
IRB Approval Number
STUDY00009553