How does advice influence the job search behavior?

Last registered on January 13, 2022

Pre-Trial

Trial Information

General Information

Title
How does advice influence the job search behavior?
RCT ID
AEARCTR-0008809
Initial registration date
January 13, 2022


First published
January 13, 2022, 8:29 AM EST


Locations

Primary Investigator

Affiliation
Duke Kunshan University

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2022-01-25
End date
2022-02-28
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This study involves an online lab-style experiment in which participants will solve a search task 10 times. They search for wage offers by choosing a reservation wage (the minimum acceptable offer), after which a computer randomly draws an offer from a known wage distribution. Whether the participant accepts the offer depends on the value of the offer relative to the reservation wage. This is repeated until the participant accepts an offer or the search round ends.

In this context, we study how individuals advise each other about the choice of the reservation wage. In treatment 1, individuals participate in the 10 rounds of search and then leave advice for a randomly matched participant in treatment 2. The advice consists of a recommended reservation wage and a free-form text message. Participants in treatment 2 will see the advice at the beginning of the experiment and then participate in the 10 rounds of the search task. The payment of participants in treatment 1 will depend on the search performance of their matched participants in treatment 2, which gives them an incentive to leave meaningful advice.
We study what sort of advice individuals leave for their peers and how that advice influences the peers' decisions in treatment 2. In particular, we are interested in whether receiving advice from an experienced decision-maker improves decisions in the search task, in the sense that individuals choose a reservation wage closer to the optimal value.

We will have two settings: one without search costs and one with search costs, in which participants receive offers only after completing a coding task. This means we will have two versions of treatment 1 and two versions of treatment 2.

External Link(s)

Registration Citation

Citation
Horvath, Gergely. 2022. "How does advice influence the job search behavior?." AEA RCT Registry. January 13. https://doi.org/10.1257/rct.8809-1.0
Experimental Details

Interventions

Intervention(s)
The experiment will consist of 4 treatments, organized in 2 pairs. In pair 1 there are no search costs; in pair 2, search costs are represented by a real-effort coding task. Each pair contains two treatments: in Treatment 1, participants complete the search task 10 times and then leave advice; in Treatment 2, participants receive advice and then complete the search task 10 times.
Intervention Start Date
2022-01-25
Intervention End Date
2022-02-28

Primary Outcomes

Primary Outcomes (end points)
We record the reservation wage choices of the participants and the advice given (recommended reservation wage plus free-form message).
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The main part of the experiment consists of an infinite-horizon search task, played for 10 rounds. Within a round, the following sequence of events occurs. In period 1, participants choose a reservation wage (that is, a minimum acceptable offer), after which an offer is drawn uniformly at random from the set of integers between 1 and 100 points. Participants know the offer distribution and cannot change their reservation wage after seeing the offer they received. If the offer received is at least as large as the reservation wage, the participant accepts the offer and earns the value of the offer for the remaining periods of the round, without making any further decisions. Otherwise, she moves to the next period, where she chooses a reservation wage and draws an offer again, without being able to recall previous offers. In period 1 and in every subsequent period of searching, participants receive 30 points; these points are no longer paid once an offer has been accepted. This process is repeated until the end of the round.

The number of periods in a round is determined randomly: after any given period, there is a next period with 95% probability, while with 5% probability the round ends. This random termination method creates an infinite horizon in the experiment.

The sum of points earned in all periods of a round constitutes the participant's payoff from that round. The payoff from the whole search task is the participant's payoff from one randomly chosen round.
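For concreteness, the sketch below simulates one round of the no-search-cost version under the parameters above (offers uniform on the integers 1 to 100, 30 points per search period, 95% continuation probability, which implies an expected round length of 1/0.05 = 20 periods). It is an illustration only: the function name, the use of a fixed reservation wage for the whole round (participants in the experiment may choose a new one each period), and the assumption that the accepted offer is paid from the period after acceptance onward are ours, not details taken from the design.

    import random

    def simulate_round(reservation_wage, continue_prob=0.95,
                       search_income=30, rng=random):
        # One round of the search task (no-search-cost version), illustrative timing:
        # 30 points are earned in every period spent searching, and the accepted
        # offer is paid in every period after acceptance until the round ends.
        points = 0
        accepted_offer = None
        while True:
            if accepted_offer is None:
                points += search_income              # income while still searching
                offer = rng.randint(1, 100)          # uniform draw from 1..100
                if offer >= reservation_wage:        # offer meets the reservation wage
                    accepted_offer = offer
            else:
                points += accepted_offer             # accepted offer paid each period
            if rng.random() >= continue_prob:        # 5% chance the round terminates
                return points

    # Example: average payoff of a fixed reservation wage of 70 over many simulated rounds
    print(sum(simulate_round(70) for _ in range(10000)) / 10000)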

We have two versions of this search task: one as described above, and another with search costs. Search costs will be implemented through a real-effort coding task: after submitting the reservation wage, participants have to encode three letters into numbers using a coding table, and they receive their offer only after completing the coding correctly.
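Because the study asks whether advice moves choices toward the optimal reservation wage, the following sketch shows one way to approximate the payoff-maximizing stationary reservation wage implied by these parameters, using value iteration on a standard McCall-style search model. It relies on the same illustrative timing assumptions as the sketch above and ignores the coding task, which imposes an effort cost rather than a point cost; the function name and tolerance are our own choices, not the experiment's benchmark.

    import math

    def optimal_reservation_wage(continue_prob=0.95, search_income=30,
                                 offers=range(1, 101), tol=1e-9):
        # Value iteration on the value of being in a search period, assuming the
        # accepted offer is paid from the next period onward (as in the sketch above).
        delta = continue_prob
        accept_value = {w: delta * w / (1.0 - delta) for w in offers}  # value of accepting w
        v = 0.0                                                        # value of searching
        while True:
            v_new = search_income + sum(
                max(accept_value[w], delta * v) for w in offers) / len(offers)
            if abs(v_new - v) < tol:
                break
            v = v_new
        # Accepting w is optimal when delta*w/(1-delta) >= delta*v, i.e. w >= (1-delta)*v,
        # so the smallest acceptable integer offer is:
        return math.ceil((1.0 - delta) * v)

    print(optimal_reservation_wage())  # point-maximizing threshold under these assumptions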

For both the condition without and the condition with search costs, we will have two treatments. Treatment 1 consists of three parts. In Part 1, participants complete the 10 rounds of the search task as described above. In Part 2, they leave advice for a randomly chosen participant in Treatment 2, suggesting a strategy to follow in the search task. The advice consists of a recommended reservation wage and a free-form text message. Advice giving is incentivized: the advisor's payoff from Part 2 equals half of the advisee's payoff in the search task. In Part 3, participants fill out a survey that asks for demographic information (gender, age, education, student status) and contains a risk-preference elicitation task and a cognitive reflection test, both of which are incentivized.


Treatment 2 consists of two parts. Part 1 contains the 10 rounds of the search task. Before starting the first round, each participant in Treatment 2 receives advice from a randomly chosen participant in Treatment 1. Part 2 contains the same survey as described above. In both treatments, a participant's total payoff is the sum of the payoffs from all parts of the experiment.
Experimental Design Details
We ensure that the search tasks in the two treatments are experimentally comparable in the sense that the participants' information sets and their expected payoffs before the search task are the same. We achieve this by first giving participants the experimental instructions for the search task (namely Part 1 in both treatments) and showing them the instructions for the later parts only after they have completed the search task. In addition, participants in Treatment 2 do not know that their choices in the search task affect the payoffs of participants in Treatment 1. This rules out any influence of other-regarding preferences on decisions in the search task.
Randomization Method
Randomization is done by the computer, based on the order in which participants arrive at the experiment on the Prolific website, where we recruit participants.
Randomization Unit
Individual participants who are registered and active on Prolific.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
We will recruit 400 participants in total: 100 in Treatment 1 without search costs, 100 in Treatment 2 without search costs, 100 in Treatment 1 with search costs, and 100 in Treatment 2 with search costs.
Sample size: planned number of observations
For reservation wage: 100 participants * 10 rounds * the number of periods they search (endogenous) * 4 treatments.
For advice: 100 participants in Treatment 1 without search costs + 100 participants in Treatment 1 with search costs.
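As an illustration of the endogenous component: under a stationary reservation wage R, a further reservation-wage decision occurs only if the round continues (probability 0.95) and the current offer is rejected (probability (R - 1)/100), so the expected number of decisions per round is 1/(1 - 0.95*(R - 1)/100); at R = 70, for example, this is roughly 2.9 decisions per round.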
Sample size (or number of clusters) by treatment arms
For Reservation wage: 100 participants * 10 rounds * the number of periods they search (endogenous)
For Advice: 100 participants
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Duke Kunshan University Institutional Review Board
IRB Approval Date
2021-10-14
IRB Approval Number
2021GH085

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials