The American Economic Association's registry for randomized controlled trials
Honesty in the Digital Age
Last registered on July 23, 2018
Initial registration date
July 13, 2018
Last updated
July 23, 2018 4:14 AM EDT
University of Michigan
Other Primary Investigator(s)
University of Zurich
Additional Trial Information
Modern communication technologies enable efficient exchange of information, but often sacrifice
direct human interaction inherent in more traditional forms of communication. This raises the
question of whether the lack of personal interaction induces individuals to exploit informational
asymmetries. We conducted two experiments with 866 subjects to examine how human versus
machine interaction influences cheating for financial gain. We find that individuals cheat about
three times more when they interact with a machine rather than a person, regardless of whether
the machine is equipped with human features. When interacting with a human, individuals are
particularly reluctant to report unlikely favorable outcomes, which is consistent with social image
concerns. The second experiment shows that dishonest individuals prefer to interact with a machine
when facing an opportunity to cheat. Our results suggest that human interaction is key to mitigating
dishonest behavior and that self-selection into communication channels can be used to screen for
Cohn, Alain, Tobias Gesche and Michel Marechal. 2018. "Honesty in the Digital Age." AEA RCT Registry. July 23.
Sponsors & Partners
Subjects performed coin tosses and reported their outcomes via different communication channels.
Intervention Start Date
Intervention End Date
Primary Outcomes (end points)
number of successful coin tosses
Primary Outcomes (explanation)
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
We varied two factors using a two-by-two factorial design: (i) whether subjects reported their outcomes to a person or a machine, and (ii)
whether the interaction involved oral or written communication.
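The registered design can be illustrated with a small simulation. This is a sketch only: the arm labels follow the 2×2 design above, but the number of tosses per subject, the per-arm cheating rates, and the sample size per arm are assumed for demonstration and are not taken from the registry.

```python
import random

random.seed(0)

N_TOSSES = 10   # assumed number of coin tosses per subject
FAIR_P = 0.5    # probability of a favorable outcome per honest toss

def report(cheat_prob):
    """One subject's reported count of successful tosses.

    With probability `cheat_prob` the subject over-reports by claiming
    the maximum; otherwise they report the true binomial draw.
    """
    true_count = sum(random.random() < FAIR_P for _ in range(N_TOSSES))
    return N_TOSSES if random.random() < cheat_prob else true_count

# The four treatment arms (receiver x communication mode), with
# hypothetical cheating rates chosen only to illustrate the comparison.
arms = {
    ("human", "oral"): 0.05,
    ("human", "written"): 0.05,
    ("machine", "oral"): 0.15,
    ("machine", "written"): 0.15,
}

for arm, cheat_prob in arms.items():
    reports = [report(cheat_prob) for _ in range(200)]
    # Excess over the honest benchmark N_TOSSES * FAIR_P indicates cheating.
    excess = sum(reports) / len(reports) - N_TOSSES * FAIR_P
    print(arm, round(excess, 2))
```

Comparing each arm's mean reported count against the honest binomial benchmark (N_TOSSES × 0.5) is the standard way to detect aggregate over-reporting in coin-toss paradigms without identifying individual cheaters.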
Experimental Design Details
Was the treatment clustered?
Sample size: planned number of clusters
Sample size: planned number of observations
Sample size (or number of clusters) by treatment arms
between 86 and 161
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Approval Date
IRB Approval Number
Post Trial Information
Is the intervention completed?
Is data collection complete?
Is public data available?
Reports and Papers