Signaling skills online and labor market outcomes: evidence from an adaptive experiment

Last registered on September 27, 2023

Pre-Trial

Trial Information

General Information

Title
Signaling skills online and labor market outcomes: evidence from an adaptive experiment
RCT ID
AEARCTR-0010085
Initial registration date
September 25, 2022

First published
September 27, 2022, 11:54 AM EDT

Last updated
September 27, 2023, 9:46 AM EDT

Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
PI Affiliation
PI Affiliation
CREST / LISN

Additional Trial Information

Status
In development
Start date
2023-01-09
End date
2023-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Digital job-search tools promise to reduce frictions in the labor market. In this randomized controlled trial, we study an intervention seeking to increase usage of a public online platform in France. The intervention, implemented with an adaptive design, consists of sending emails to job seekers that provide information, help, and motivation to register or update their profiles on the platform. Among the several types of incentives considered, we seek to identify which treatments maximize uptake of the tool. We then analyze the impact of the platform's use on labor market outcomes.
External Link(s)

Registration Citation

Citation
Bied, Guillaume et al. 2023. "Signaling skills online and labor market outcomes: evidence from an adaptive experiment." AEA RCT Registry. September 27. https://doi.org/10.1257/rct.10085-3.1
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
We send emails to registered job seekers to encourage them to use their profile on the public employment service's platform, i.e., to fill in the profile and potentially publish it to become visible to recruiters. We test several emails designed to maximize take-up by either providing information or reducing the cost of using the tool, in combination with behavioral levers. We also test two different sending hours.

- Informational treatments

(Info1) Information about use by recruiters
To emphasize the potential gains from filling in the profile, we highlight its use by other agents: we show the number of recruiters searching for candidates on the platform each month. We vary whether this information appears or not.

(Info2) Information about gains in service quality
A second piece of information concerns how providing information through the profile improves the quality of the service delivered by the institution. Counselors rely on the profile to recommend job ads to job seekers, and so do the automatic recommender systems built by the institution. We highlight how a well-filled skill profile enables Pôle emploi's counselors to better assist individuals in finding a job or a training program, and more broadly to communicate with them. We vary whether this information is present or not.

- Help provision treatments

(Help1) Including a list of steps
To reduce the cost of completion, we list the steps to follow to fully use the tool directly in the body of the email. We vary whether this tutorial is included or not.

- Including intensive help
Some job seekers may find it difficult to complete their profile independently. Some may have difficulties with the French language or with digital tools; others may want to be accompanied in the process because of a lack of self-confidence. We propose intensive help that can take two forms:
  - (Help2workshop) Workshop: we invite job seekers to self-register for a half-day workshop at the PES designed to help them fill in their profile.
  - (Help2counselor) Counselor: we invite job seekers to book an appointment with a counselor via a call-to-action button that redirects them to the appropriate webpage.
We vary whether the intensive help is proposed or not and, if so, whether it takes the form of a workshop or a counselor appointment.

- Scheduled time to send the email

(Schedule) Sending hours:
We vary whether the email is sent at 9am or at 3pm.

Possible combinations
We aim to test interactions among some of these email contents. We selected a subset of combinations of email contents that we considered likely to be relevant, leading to a total of 16 different emails.

Since emails can be sent at either 9am or 3pm, this experiment includes a total of 32 treatment arms.
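For concreteness, the short sketch below shows how the 32 arms follow from crossing the 16 email contents with the 2 sending hours. The email identifiers are placeholders: the exact composition of the 16 pre-selected contents is not enumerated in this registration.

```python
from itertools import product

# Hypothetical enumeration of the 32 arms: 16 pre-selected email contents
# (placeholder identifiers, not the actual contents) crossed with 2 sending hours.
email_variants = [f"email_{i:02d}" for i in range(1, 17)]
sending_hours = ["9am", "3pm"]

arms = [f"{email}_{hour}" for email, hour in product(email_variants, sending_hours)]
assert len(arms) == 32  # 16 email contents x 2 sending hours
```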
Intervention (Hidden)
Intervention Start Date
2023-01-10
Intervention End Date
2023-04-10

Primary Outcomes

Primary Outcomes (end points)
- Online profile related outcomes
- Impact on the individual's usage of the platform
- Labor market outcomes
Primary Outcomes (explanation)
- Online profile related outcomes: (1) whether the job seeker connected to their account, visited the profile page, and modified their profile within different time windows (3, 7, and 30 days) after the emails were sent; (2) whether the job seeker published their profile on the public employment service's website; (3) number of visits to the job seeker's profile by recruiters and/or caseworkers; (4) an indicator of the degree of completion and quality of the profile, in percentages.

- Usage of Pôle emploi's website in general: whether the job seeker visits pages related to job search on Pôle emploi's website within different time windows (3, 7, and 30 days).

- Labor market outcomes: (1) number and type of job ads recommended to the job seeker by the caseworker and by automatic suggestions; (2) number of times the job seeker was contacted by recruiters on the website; (3) number and type of ads the job seeker clicked on or applied to; (4) return to employment; (5) characteristics of the job when re-employed.
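As an illustration, the sketch below shows one way the 3/7/30-day indicator outcomes could be constructed from activity logs. The column names (jobseeker_id, event_date, email_sent_date) are assumptions for the example, not taken from the registration.

```python
import pandas as pd

# Hypothetical activity logs and experimental sample (placeholder data).
logs = pd.DataFrame({
    "jobseeker_id": [1, 1, 2],
    "event_date": pd.to_datetime(["2023-01-12", "2023-02-01", "2023-01-20"]),
})
sample = pd.DataFrame({
    "jobseeker_id": [1, 2, 3],
    "email_sent_date": pd.to_datetime(["2023-01-10", "2023-01-10", "2023-01-10"]),
})

merged = sample.merge(logs, on="jobseeker_id", how="left")
delay = (merged["event_date"] - merged["email_sent_date"]).dt.days

for window in (3, 7, 30):
    hit = (delay >= 0) & (delay <= window)
    # 1 if at least one profile-page visit occurred within `window` days, else 0
    outcome = hit.groupby(merged["jobseeker_id"]).max().astype(int)
    sample[f"visited_within_{window}d"] = sample["jobseeker_id"].map(outcome)

print(sample)
```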

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Each week, we follow these steps:
1. We sample 36,000 job seekers from the eligible population in week t.
2. We split them into two samples: 2/3 of the sample is allocated to a control group (which does not receive any incentive) and the remaining 1/3 to a treated group (whose members receive one of the incentives according to estimated individual allocation probabilities).
3. For the treated group, an algorithm* determines allocation probabilities for each individual; treatments are then assigned accordingly.
4. Individuals in the treated group are sent a message shortly after the assignment.
5. We observe whether individuals in the treated group clicked on the pages of the online profile. This information is used by the algorithm to update the allocation probabilities.

After at most 10 weeks, we stop the adaptive assignment and reserve some observations for evaluating the learned assignment policy.

*Description of the algorithm: we use an honest generalized random forest to predict the impact of each treatment based on data collected in the previous weeks. Assignments are generated using a Thompson sampling procedure based on the means and variances of the individual predictions from the forest (in the first week, interventions are assigned at random).
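A minimal sketch of the Thompson sampling step, assuming the honest forest has already produced, for each individual and each arm, a predicted effect and a variance estimate. The arrays below are simulated placeholders with illustrative shapes, not the actual forest output or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_individuals, n_arms = 1_000, 32

# Assumed inputs: per-individual, per-arm predicted treatment effects (mu) and
# variance estimates (var) from the honest generalized random forest.
# Simulated here for illustration only.
mu = rng.normal(0.02, 0.01, size=(n_individuals, n_arms))
var = np.full((n_individuals, n_arms), 0.01 ** 2)

# Thompson sampling: draw one value per arm from the approximate posterior
# N(mu, var) and assign each individual the arm with the largest draw.
draws = rng.normal(mu, np.sqrt(var))
assigned_arm = draws.argmax(axis=1)

# Implied individual allocation probabilities can be approximated by
# repeating the sampling step many times.
probs = np.zeros((n_individuals, n_arms))
for _ in range(200):
    winners = rng.normal(mu, np.sqrt(var)).argmax(axis=1)
    probs[np.arange(n_individuals), winners] += 1 / 200
```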
Experimental Design Details
Randomization Method
Allocation to the control or treated group is done by a computer, with the same treatment probability (1/3) for every individual. Within the treated group, the intervention is chosen by a contextual bandit algorithm (i.e., randomization performed by a machine learning algorithm with personalized sampling weights).
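For concreteness, a sketch of the two-stage draw described above, with uniform placeholder weights standing in for the personalized sampling weights produced by the bandit step:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 36_000      # weekly sample, as in the design
n_arms = 32

# Stage 1: each job seeker is treated with probability 1/3, otherwise control.
treated = rng.random(n) < 1 / 3

# Stage 2: within the treated group, draw an arm according to personalized
# sampling weights. Uniform weights are used here as placeholders; in the
# actual design they come from the contextual bandit (Thompson sampling) step.
weights = np.full((treated.sum(), n_arms), 1 / n_arms)
arm = np.array([rng.choice(n_arms, p=w) for w in weights])
```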
Randomization Unit
The unit of the randomization is the job seeker.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
No cluster
Sample size: planned number of observations
~ 450,000 observations, one third in the treated group.
Sample size (or number of clusters) by treatment arms
The treated group of size 150,000 is split across 32 treatment arms; the sample size of each arm is not known in advance (it is progressively adapted so that individuals are assigned to the best-performing treatments).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Institutional Review Board - Paris School of Economics - Ecole d'Economie de Paris
IRB Approval Date
2022-09-04
IRB Approval Number
2022 - 019

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials