
Do workers discriminate against their out-group employers? Evidence from an online labor market

Last registered on March 31, 2019

Pre-Trial

Trial Information

General Information

Title
Do workers discriminate against their out-group employers? Evidence from an online labor market
RCT ID
AEARCTR-0003885
Initial registration date
March 31, 2019

First published
March 31, 2019, 11:20 PM EDT

Locations

Primary Investigator

Affiliation
Iowa State University

Other Primary Investigator(s)

PI Affiliation
Indian Institute of Management, Bangalore
PI Affiliation
Iowa State University

Additional Trial Information

Status
In development
Start date
2019-04-01
End date
2019-12-31
Secondary IDs
Abstract
A large body of literature in economics has demonstrated that prejudice or bias of the majority group toward members of an out-group identity – whether racial, religious, ethnic, or gender-based in origin – is widespread in labor markets. Such biases often lead to discrimination. It is commonly believed that labor market discrimination is one-sided: driven by employers toward their out-group employees. In this research, we restrict attention to racial identity and study possible discrimination in the reverse direction. That is, we ask: do workers discriminate on the intensive margin (say, by shirking or under-providing effort) when working for an out-group employer relative to an otherwise-identical, own-group one? We design a large-scale real-effort experiment on Amazon's Mechanical Turk to answer this question.
External Link(s)

Registration Citation

Citation
Asad, Sher Afghan, Ritwik Banerjee and Joydeep Bhattacharya. 2019. "Do workers discriminate against their out-group employers? Evidence from an online labor market." AEA RCT Registry. March 31. https://doi.org/10.1257/rct.3885-1.0
Former Citation
Asad, Sher Afghan, Ritwik Banerjee and Joydeep Bhattacharya. 2019. "Do workers discriminate against their out-group employers? Evidence from an online labor market." AEA RCT Registry. March 31. https://www.socialscienceregistry.org/trials/3885/history/44422
Sponsors & Partners

Sponsors

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2019-04-15
Intervention End Date
2019-05-26

Primary Outcomes

Primary Outcomes (end points)
Each worker in this experiment will work on a simple button-pressing task, alternating 'a' and 'b' on the keyboard, to score 'points'. The number of points scored by a worker (effort) on the task in a given treatment is the primary outcome of interest.
Primary Outcomes (explanation)
Our measure of 'discrimination' will be constructed by comparing workers' effort choices when working for a Black employer with their effort choices when working for a White employer.
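
The registration does not commit to a particular estimator. Purely as an illustration (our formalization, not the authors'), the comparison can be written as a difference in mean points scored across the two race-salient arms, or equivalently as the coefficient on a Black-employer indicator:

D = \mathbb{E}[\mathrm{points}_i \mid \text{Black employer}] - \mathbb{E}[\mathrm{points}_i \mid \text{White employer}],
\qquad \mathrm{points}_i = \alpha + \beta\,\mathbb{1}\{\text{Black employer}_i\} + \varepsilon_i,

where a negative \beta would indicate that workers under-provide effort for Black relative to White employers; the altruism and reciprocity arms allow this comparison to be made separately under each payment scheme.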

Secondary Outcomes

Secondary Outcomes (end points)
Beliefs about the demographics of the racial groups Black and White.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In this experiment, each worker will be randomly assigned to one of the following ten treatments and will then work on a simple button-pressing task, alternating 'a' and 'b' button presses on the keyboard, to score 'points'. The payment scheme and the matched employer will vary with the assigned treatment. The treatments are listed below; an illustrative sketch of the payment rules follows the list.
1. Piece Rate – 0 cents: A worker’s payment will be unaffected by the number of points he/she scores in the task. No matched employer.
2. Piece Rate – 3 cents: A worker will be paid 3 cents for every 100 points he/she scores in the task. No matched employer.
3. Piece Rate – 6 cents: A worker will be paid 6 cents for every 100 points he/she scores in the task. No matched employer.
4. Piece Rate – 9 cents: A worker will be paid 9 cents for every 100 points he/she scores in the task. No matched employer.
5. Altruism Baseline: A worker’s payment will be unaffected by the number of points he/she scores in the task. Worker’s matched employer will be paid 1 cent for every 100 points scored by the worker. The employer identity will be hidden.
6. Altruism Black: Earning rule will be the same as in the Altruism Baseline for both the worker and the employer. The employer’s forearm and hand will reveal dark skin color in the video. The employer will be Black.
7. Altruism White: Earning rule will be the same as in the Altruism Baseline for both the worker and the employer. The employer’s forearm and hand will reveal white skin color in the video. The employer will be White.
8. Reciprocity Baseline: A worker’s payment will be unaffected by the number of points he/she scores in the task. The worker will be paid 20 cents extra as a reward before the task begins. Worker’s matched employer will be paid 1 cent for every 100 points scored by the worker. The employer identity will be hidden.
9. Reciprocity Black: Earning rule will be the same as in the Reciprocity Baseline for both the worker and the employer. The employer will be Black.
10. Reciprocity White: Earning rule will be the same as in the Reciprocity Baseline for both the worker and the employer. The employer will be White.
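
The following minimal sketch (in Python) restates the payment rules above; the function name, the treatment labels, and the assumption that pay accrues per complete block of 100 points are ours, not part of the registration.

def payout_cents(points, treatment):
    """Illustrative per-treatment payouts, in cents, for a worker scoring `points`."""
    blocks = points // 100  # assumption: pay accrues per complete block of 100 points
    piece_rates = {"piece_0": 0, "piece_3": 3, "piece_6": 6, "piece_9": 9}
    if treatment in piece_rates:
        # Piece-rate arms: the worker is paid per 100 points; no matched employer.
        return {"worker": piece_rates[treatment] * blocks, "employer": None}
    # Altruism and reciprocity arms (baseline / Black / White): the worker's score
    # does not change his/her own task pay, but the matched employer earns
    # 1 cent per 100 points; reciprocity arms add a 20-cent upfront reward.
    upfront = 20 if treatment.startswith("reciprocity") else 0
    return {"worker": upfront, "employer": 1 * blocks}

# For example, 2,000 points in the 6-cent piece-rate arm pays the worker 120 cents,
# while the same score in an altruism arm pays the matched employer 20 cents.
payout_cents(2000, "piece_6")         # {'worker': 120, 'employer': None}
payout_cents(2000, "altruism_black")  # {'worker': 0, 'employer': 20}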
Experimental Design Details
This experiment will recruit subjects from Amazon’s Mechanical Turk (M-Turk) and Black and White student subjects from Iowa State University. The student subjects will be the “employers,” while the M-Turk subjects will be the “workers.” Each worker will be randomly assigned to one of the ten treatments listed above and will then work on a simple button-pressing task, alternating 'a' and 'b' button presses on the keyboard, to score 'points'. In the six social-preference treatments, each worker will be matched with an employer, and the worker's performance will determine how much he/she and the matched employer earn. The employer will not make any strategic choices (such as a wage offer or minutes of work), thereby eliminating most channels for statistical discrimination by workers.

We take the approach of revealing race via skin color. To that end, the employer-students will be videotaped while reading a script that explains and demonstrates the “a-b” task. The camera will capture only the employer’s forearm and the movement of the fingers alternating ‘a’ and ‘b’ button presses; other identifiers, such as the face, will not be shown in the video. The employer’s hand and forearm will be bare or covered (with full sleeves and typing gloves) depending on the assigned treatment. In the relevant treatments, the audio will be partially digitally processed to reduce racial markers in the voice.

Having video-recorded the employers, we will recruit subjects from M-Turk to work on the button-pressing task. Each worker will be randomly matched with an employer and given up to 10 minutes to work on the task. Before starting, each worker will watch a pre-recorded video explaining the task; the video entirely conceals the employer's skin color in the baseline setting and reveals it in the race-salient setting. The random assignment of a worker to a video determines the worker's treatment. Upon completion, follow-up questions will elicit the worker's beliefs about the matched employer.

To avoid confounds from other social identities, we restrict attention to male employers. Based on our pilot for this study, it is difficult to recruit enough Black workers on M-Turk to support credible inference. We therefore restrict the worker sample to White workers and study their effort choices for Black versus White employers.
Randomization Method
Done by Qualtrics' randomization feature as the worker joins the study.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
6,000 individuals
The study will be kept open on Amazon Mechanical Turk until either three weeks have passed or 6,000 subjects have completed the study, whichever comes first. If 6,000 subjects have not completed the study within three weeks, the study will remain open (up to six weeks) until 6,000 subjects are obtained.
Sample size: planned number of observations
6,000 individuals
Sample size (or number of clusters) by treatment arms
600 individuals per treatment
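The registration leaves the minimum detectable effect field below blank. Purely for orientation, under our own assumptions of a simple two-arm comparison of mean effort with 600 workers per arm, a two-sided 5% test, and 80% power, a standardized minimum detectable effect could be computed as in the following sketch (this calculation is ours, not the authors'):

# Back-of-the-envelope standardized MDE for two arms of 600 workers each.
# Assumptions (ours): two-sided alpha = 0.05, power = 0.80, equal variances,
# and a simple difference in mean points scored between two treatment arms.
from statsmodels.stats.power import TTestIndPower

mde_cohens_d = TTestIndPower().solve_power(
    effect_size=None, nobs1=600, alpha=0.05, power=0.80,
    ratio=1.0, alternative="two-sided",
)
print(round(mde_cohens_d, 3))  # roughly 0.16 standard deviations of the effort measure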
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials