Job Search - an Information Experiment

Last registered on September 13, 2021

Pre-Trial

Trial Information

General Information

Title
Job Search - an Information Experiment
RCT ID
AEARCTR-0008179
Initial registration date
September 12, 2021

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
September 13, 2021, 11:17 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
University of Nottingham Ningbo China

Other Primary Investigator(s)

PI Affiliation
Beijing Normal University

Additional Trial Information

Status
In development
Start date
2021-09-13
End date
2023-12-31
Secondary IDs
National Natural Science Foundation of China (Project No. 71973016), the Beijing National Natural Science Foundation (Project No. 9192013)
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We conduct a field experiment on a large Chinese online job board and analyze the effect of information provision and goal setting on the job search behavior of job seekers.
External Link(s)

Registration Citation

Citation
He, Haoran and Marcus Roel. 2021. "Job Search - an Information Experiment." AEA RCT Registry. September 13. https://doi.org/10.1257/rct.8179-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
n/a
Intervention (Hidden)
We conduct a field experiment on a large Chinese online job board. The intervention takes place through an online survey (our main survey) that is pushed to job seekers through in-app messages. Participation is voluntary but compensated, which is common for online surveys carried out on the job board. The survey elicits job seekers' intended occupation and their beliefs (and their confidence in such beliefs) about the number of applications per vacancy in this occupation in the past month (referred to as “past prior beliefs” hereafter) as well as in the next month (referred to as “future prior beliefs” hereafter).

Then, the survey implements a 2x2 design that varies whether information about last month’s labor market competition is provided (info-treatment) and whether daily goals for the number of jobs they plan to apply for in the upcoming month are elicited (goal-treatment). Finally, future beliefs (referred to as “future posterior beliefs” hereafter) are elicited once again after information provision and/or goal setting; they are also re-elicited in the no-information treatments to control for repeated-elicitation effects.

At the time of writing this pre-registration, it appears that the job board has upgraded its administrative data recording system, so that, in addition to the job ads that job seekers apply for, we may have access to further behavioral data: which job ads they open (but do not apply for), including the date/time at which they do so. We may also have access to the date, start time, and end time of job seekers’ sessions on the app and, on the other side of the labor market, to whether employers contact job seekers in response to their applications via the internal message system. However, the availability and quality of these data are still uncertain.*

A month after the main survey has been completed by all participants, we conduct an online follow-up survey, pushed to all previous survey respondents through in-app messages and/or cellphone text messages, to collect their job search outcomes. However, this additional dataset is likely to be relatively small due to attrition, and its purpose is mainly to provide robustness checks and additional supportive evidence for the main analysis.

*We collaborated with the job board in a previous study (He, Neumark and Weng, 2021). Since then, the job board has introduced new features and increased the tracking of its users. How viable some of these new measures will be is unclear at this point in time. We will comment on the potential analysis in our analysis plan below.
Intervention Start Date
2021-09-13
Intervention End Date
2021-12-31

Primary Outcomes

Primary Outcomes (end points)
- Future posterior beliefs: beliefs about the level of competition in their occupation after the intervention.

- Search effort: number of applications made following the intervention, number of job ads read but not applied to (if data from the job board are available), and time spent searching for jobs on the job board (if such data are available).
Primary Outcomes (explanation)
n/a

Secondary Outcomes

Secondary Outcomes (end points)
- Goals: intended daily number of applications.

- Reservation wage: proxied by the wage offers of jobs that the job seeker applies for.

- Occupation switching: proportion of applications made in the intended occupation, etc.

- Job search outcomes: employers’ responses to job seekers’ applications (if data from the job board are available); number of interviews, number of offers, whether a job was accepted, accepted wage, etc. (elicited in the follow-up survey and subject to strong attrition).
Secondary Outcomes (explanation)
n/a

Experimental Design

Experimental Design
We conduct a field experiment on a large Chinese online job board and analyze the effect of information provision and goal setting on job seekers’ search behavior.
Experimental Design Details
Experimental setting – the online job board and job seekers:
We conduct the experiment on an online job board, which is one of the largest nationwide online job boards in China. The job board posts tens of millions of job openings per year and has more than 100 million registered job seekers, of whom millions are active each day. The job board specializes in white-collar jobs requiring higher education, and its job seekers are highly educated. Employers post job ads that are produced using a standard template and that list important information about the job and employee requirements.

In order to search and apply for jobs on the job board, job seekers need to first register and provide individual information to construct a standardized résumé. The required information includes, among other things, gender, birth date, location, education and work/internship experience, current employment status, and type of job sought with location, intended industry, occupation and salary range specified.
As researchers, we have full access to job seekers’ (anonymized) résumés (extracted from the administrative database at the time the survey is completed) as well as to the jobs they apply for both one month before and one month after the survey (extracted from the administrative database one month after the survey is completed).
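
These extracts allow the pre- and post-intervention search-effort measures to be constructed directly. A minimal sketch, assuming hypothetical file and column names (seeker_id, applied_at, survey_completed_at) for the administrative data:

```python
import pandas as pd

# Illustrative sketch only. File names and columns are hypothetical stand-ins
# for the job board's administrative extracts.
applications = pd.read_csv("applications.csv", parse_dates=["applied_at"])
surveys = pd.read_csv("main_survey.csv", parse_dates=["survey_completed_at"])

merged = applications.merge(
    surveys[["seeker_id", "survey_completed_at"]], on="seeker_id", how="inner"
)
delta = merged["applied_at"] - merged["survey_completed_at"]
window = pd.Timedelta(days=30)

# Count applications in the month before and the month after the survey.
pre = merged[(delta >= -window) & (delta < pd.Timedelta(0))]
post = merged[(delta >= pd.Timedelta(0)) & (delta <= window)]

search_effort = pd.DataFrame({
    "apps_pre": pre.groupby("seeker_id").size(),
    "apps_post": post.groupby("seeker_id").size(),
}).fillna(0).astype(int)
```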

After registering and logging in, job seekers can click to open job ads that are listed on employers’ web pages or presented to them after searching the system using a search bar. When they open an ad, they see a full-page description and can then click the “Send Résumé” button to send their pre-generated résumé to the employer to apply for that job. The employer receives the résumé immediately. At the same time, a chat box is created so that the employer and the job seeker can interact freely with each other if they would like to. However, most further interactions typically take place outside of the job board. Consequently, it is unlikely that we learn (via the job board) whether a job seeker receives an interview or job offer and whether such an offer is accepted.


Sampling:

Our surveys are run in cooperation with the job board and feature its official company name, which is shown to job seekers. We define a job seeker as active if she applied for at least one job on the job board on the previous day. On the first day of the survey, the job board draws the population of all active job seekers of that day and pushes the main survey through an in-app message to all of them. On subsequent days, the job board redraws the population of all active job seekers and pushes the survey to those eligible. The survey will be terminated when the sample size reaches our target.


Treatments:

For an explanation of the treatments, please refer to the intervention section above.
Randomization Method
Randomization is done by the online survey system. We stratify based on job seekers’ intended occupation (elicited as the first question of the survey).
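
Randomization itself is implemented inside the survey system; the sketch below is only an illustrative offline version of stratified assignment to the four arms of the 2x2 design, with a hypothetical respondent table and arm labels:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(8179)

# Hypothetical respondent table; "occupation" is the stratification variable
# elicited as the first question of the survey.
respondents = pd.DataFrame({
    "seeker_id": np.arange(12),
    "occupation": ["sales", "finance", "IT"] * 4,
})

# The four arms of the 2x2 design: information provision x goal setting.
arms = np.array(["control", "info-only", "goal-only", "info+goal"])

def assign_within_stratum(group):
    # Shuffle within the occupation stratum, then cycle through the arms so
    # each arm receives roughly a quarter of the stratum.
    group = group.iloc[rng.permutation(len(group))].copy()
    group["arm"] = arms[np.arange(len(group)) % len(arms)]
    return group

assigned = (
    respondents.groupby("occupation", group_keys=False)
    .apply(assign_within_stratum)
    .sort_values("seeker_id")
)
print(assigned)
```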
Randomization Unit
The unit of randomization is at the individual job-seeker level.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
We cluster by intended occupation. The planned number of clusters is 54.
Sample size: planned number of observations
20,000 job seekers
Sample size (or number of clusters) by treatment arms
20,000/4 = 5,000 job seekers per treatment arm
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Given the large sample, power to detect economically significant effects is not a concern. If needed, we will compute minimum detectable effect sizes after the data are collected and standard deviations are known.
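
For reference, a hedged sketch of how a minimum detectable effect (in standard-deviation units) could later be computed for a pairwise comparison of two arms of roughly 5,000 job seekers each, before any clustering adjustment; the statsmodels call and the 80% power / 5% significance parameters are illustrative assumptions, not part of the registration:

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative only: pairwise comparison of two arms with roughly 5,000 job
# seekers each (20,000 split evenly across four arms), before any clustering
# adjustment for the 54 occupation strata.
mde_sd_units = TTestIndPower().solve_power(
    effect_size=None,   # solve for the minimum detectable effect (Cohen's d)
    nobs1=5000,
    ratio=1.0,
    alpha=0.05,
    power=0.80,
    alternative="two-sided",
)
print(f"MDE ~ {mde_sd_units:.3f} standard deviations")
```

Clustering by occupation would scale this figure up by the square root of the design effect once intra-cluster correlations are known.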
IRB

Institutional Review Boards (IRBs)

IRB Name
Beijing Normal University Business School IRB
IRB Approval Date
2021-09-01
IRB Approval Number
BNU-BS-IRB 2021-036
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials