The slider task: An oTree implementation for studying incentive effects in online experiments

Last registered on February 10, 2026


Trial Information

General Information

Title
The slider task: An oTree implementation for studying incentive effects in online experiments
RCT ID
AEARCTR-0016367
Initial registration date
February 02, 2026


First published
February 10, 2026, 5:59 AM EST


Locations

Region

Primary Investigator

Affiliation
Universität Hamburg

Other Primary Investigator(s)

PI Affiliation
Università degli Studi di Pisa

Additional Trial Information

Status
In development
Start date
2026-02-03
End date
2026-02-28
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The slider task is one of the most widely used computerized real-effort tasks in experimental research. However, previous laboratory results suggest that it may lead to boundary solutions and therefore be prone to Type II errors when used to detect treatment effects on effort provision. We introduce a novel oTree implementation of the slider task and assess its output elasticity in online experiments, where its properties are likely to differ from those in the laboratory. In particular, we hypothesize that subjects respond more strongly to incentives when participating online, owing to the higher opportunity cost of completing the task from home. This result would suggest that, in online environments, the slider task is well suited to detecting treatment-induced changes in output. We conduct a between-subjects online experiment in which participants complete the slider task under varying monetary incentives, and we compare the incentive effects observed in our study to those reported in similar laboratory experiments.
External Link(s)

Registration Citation

Citation
Pinna, Lorenzo and Riccardo Vannozzi. 2026. "The slider task: An oTree implementation for studying incentive effects in online experiments." AEA RCT Registry. February 10. https://doi.org/10.1257/rct.16367-1.0
Experimental Details

Interventions

Intervention(s)
Subjects are randomly allocated to treatment groups in an online experiment.
In a between-subjects design, participants receive either a low (0.005 USD), medium (0.02 USD), or high (0.08 USD) piece-rate payment for each point they score in the slider task.
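The balanced between-subjects assignment described above can be sketched in plain Python. This is a hypothetical illustration, not the registered oTree code; the function and constant names (`PIECE_RATES`, `assign_arms`) are our own:

```python
import random

# Three piece-rate arms (USD per point scored), 60 participants each
# (hypothetical constants mirroring the registered treatment arms).
PIECE_RATES = {"low": 0.005, "medium": 0.02, "high": 0.08}
N_PER_ARM = 60

def assign_arms(n_per_arm=N_PER_ARM, seed=None):
    """Return a shuffled list of arm labels, one entry per participant."""
    rng = random.Random(seed)
    arms = [arm for arm in PIECE_RATES for _ in range(n_per_arm)]
    rng.shuffle(arms)
    return arms

assignment = assign_arms(seed=42)
# Per-point payment faced by each participant, in arrival order.
payments = [PIECE_RATES[arm] for arm in assignment]
```

Drawing the full arm list and shuffling it (rather than randomizing each arrival independently) guarantees exactly 60 participants per treatment.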
Intervention Start Date
2026-02-03
Intervention End Date
2026-02-28

Primary Outcomes

Primary Outcomes (end points)
Effort provision (scores in the slider task)
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Usage of outside options
Secondary Outcomes (explanation)
We measure the time participants spend away from the slider task. Specifically, we record timestamps at which the slider task window becomes inactive (e.g., when participants switch to another browser tab or window). We construct a measure of outside option usage as the total number of seconds during which the slider task window is not active.
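As a sketch of how such a measure could be aggregated (the event format and function name are assumptions for illustration, not the study's actual implementation), total inactive time can be computed from chronologically ordered hide/show timestamps:

```python
def total_inactive_seconds(events):
    """Total seconds the task window was inactive.

    events: chronologically ordered (timestamp_seconds, state) pairs,
    where state is 'hidden' when the slider task window loses focus
    and 'visible' when it regains focus (hypothetical format).
    """
    total = 0.0
    hidden_since = None
    for ts, state in events:
        if state == "hidden" and hidden_since is None:
            hidden_since = ts
        elif state == "visible" and hidden_since is not None:
            total += ts - hidden_since
            hidden_since = None
    return total

# Example: away from t=10 to t=25 and from t=40 to t=43
# -> 15 + 3 = 18 seconds of outside option usage.
away = total_inactive_seconds(
    [(10, "hidden"), (25, "visible"), (40, "hidden"), (43, "visible")]
)
```

In a browser, the hide/show events themselves would typically come from the page's visibility or focus events; only the server-side aggregation is sketched here.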

Experimental Design

Experimental Design
In an online experiment, participants complete 10 rounds of the slider task under monetary incentives. In each round, they see a screen containing 48 sliders, which they can adjust using the mouse. Their score, interpreted as effort exerted, is the number of knobs positioned at the midpoint of their respective sliders. Participants earn a fixed payment for each point they score: in a between-subjects design, they are randomly assigned to receive either a low (0.005 USD), medium (0.02 USD), or high (0.08 USD) piece-rate payment per point.
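A minimal sketch of the scoring rule described above (the slider range, midpoint value, and function names are assumptions for illustration; the registered oTree implementation may differ):

```python
MIDPOINT = 50  # hypothetical: sliders range 0-100, target is the midpoint

def round_score(positions, midpoint=MIDPOINT):
    """Score for one round: number of sliders set exactly to the midpoint."""
    return sum(1 for p in positions if p == midpoint)

def round_payoff(positions, piece_rate):
    """Earnings for one round at the given per-point piece rate (USD)."""
    return round_score(positions) * piece_rate

# 48 sliders: 30 correctly centered, 18 slightly off.
positions = [50] * 30 + [47, 52] * 9
earnings = round_payoff(positions, 0.02)  # 30 points at the medium rate
```

Under this rule only exactly centered sliders count, which is what makes boundary solutions (all or nothing) a concern in low-incentive conditions.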

Experimental Design Details
Not available
Randomization Method
Randomization done by a computer
Randomization Unit
Individuals
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
180 individuals
Sample size: planned number of observations
180 individuals
Sample size (or number of clusters) by treatment arms
60 individuals per treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Ethics Committee of the Faculty of Business, Economics and Social Sciences at the Universität Hamburg
IRB Approval Date
2025-07-29
IRB Approval Number
2025-043
Analysis Plan

Analysis Plan Documents

Analysis plan: Slider task

MD5: 57e332aa834ab834a56d24979ac7a1b7

SHA1: 231e97e2d193e94bec5b1a78b943cb17514c1c00

Uploaded At: February 02, 2026