
How does information affect decision-making?

Last registered on March 01, 2025

Pre-Trial

Trial Information

General Information

Title
How does information affect decision-making?
RCT ID
AEARCTR-0015126
Initial registration date
January 26, 2025

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
January 31, 2025, 5:14 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
March 01, 2025, 9:20 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Region

Primary Investigator

Affiliation
TAMU

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2025-02-03
End date
2025-04-07
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This paper quantifies individuals’ value for objective information and investigates whether this information changes their beliefs and food preferences. The overall hypothesis is that information frictions (i.e., limited access to scientific, timely research findings) can help explain failures in learning and informed decision-making. We use two studies to test this hypothesis, measuring both the demand for research findings and the behavioral response to learning them. In Study 1, we will quantify individuals' value for information about synthetic biology in food (i.e., lab-grown meat) and determine whether they update their beliefs when informed of its health and environmental effects. In Study 1, we expect individuals to find peer-reviewed information and numeric evidence more valuable, and non-peer-reviewed information and descriptive evidence less informative. Based on findings from Study 1, we will develop a behaviorally informed intervention for Study 2, in which we will test whether informing consumers about the health impacts of “artificial” food additives in “natural” foods changes food preferences. In sum, this article will provide direct evidence on people’s preferences for information and on whether providing objective information to consumers can lead to behavior change.
External Link(s)

Registration Citation

Citation
Melo, Grace. 2025. "How does information affect decision-making?." AEA RCT Registry. March 01. https://doi.org/10.1257/rct.15126-1.1
Experimental Details

Interventions

Intervention(s)
Participants will be invited to take part in an experiment on “economic decision-making.” The study will begin with an introduction to lab-grown meat, followed by elicitation of participants' prior beliefs regarding its environmental and health impacts, as well as their willingness to pay (WTP) to learn findings from related impact research. Next, we will disclose the results of the incentivized elicitation procedure and subsequently measure participants' stated WTP for lab-grown meat. Finally, we will elicit their posterior beliefs to evaluate how the provided information influenced their perceptions.

The information intervention is the evidence brief given to those participants who purchase it under the BDM approach.
Intervention (Hidden)
Intervention Start Date
2025-02-03
Intervention End Date
2025-03-03

Primary Outcomes

Primary Outcomes (end points)
WTP for information on the evidence brief (research evidence)
Primary Outcomes (explanation)
Participants will be asked their WTP for the evidence brief (research evidence), elicited via the BDM approach.

Secondary Outcomes

Secondary Outcomes (end points)
prior and posterior beliefs
Secondary Outcomes (explanation)
We will elicit prior and posterior beliefs regarding the health and environmental impact of lab-grown meat. We hypothesize that those who receive information (i.e., purchase the research brief) will update their beliefs.

Experimental Design

Experimental Design
In this section, we describe the first experiment, which measures (a) individuals' demand for simple, concise research information and (b) how receiving information affects their beliefs and stated behavior. The experiment focuses on synthetic biology in food (i.e., lab-grown meat), a controversial topic in food policy discussions. Our primary hypothesis is that individuals value information about the health and environmental impact of lab-grown meat, and that they update their beliefs and change their preferences, especially when the information is perceived as trustworthy and provided in a visual manner (Figure 1).
Experimental setting and sample
We conducted Study 1 with a sample of 500 U.S. consumers in February 2025, using Forthright Access panelists to gather geographically dispersed estimates of consumer willingness to pay (WTP) for information across the United States. Forthright Access, a marketing firm, recruits respondents through a variety of online and offline advertising channels to create a nationally representative panel of US residents aged 18 and older.
We also conducted Study 1 with a sample of 500 communication experts (i.e., journalists) recruited through Prolific, an online research platform that connects researchers with high-quality participants, ensuring diverse and reliable samples for academic studies. Communication experts' WTP for information can gauge how much value they place on having credible, research-backed information in their messaging.
Experimental Design Details
Experimental design
Figure 1 illustrates the structure of Study 1. Participants were invited to take part in an experiment on “economic decision-making.” The study began with an introduction to lab-grown meat, followed by elicitation of participants' prior beliefs regarding its environmental and health impacts, as well as their willingness to pay (WTP) to learn findings from related impact research. Next, we disclosed the results of the incentivized elicitation approach and subsequently measured participants' stated WTP for lab-grown meat. Finally, we elicited their posterior beliefs to evaluate how the provided information influenced their perceptions.

Survey experiment sections

The survey, provided in the Appendix, can be summarized as follows:
Introduction. We described lab-grown meat, highlighting the factors considered when determining its environmental or health impact. To ease understanding of this impact, we provided current estimates of the health and environmental impact of meat.
Priors about impact. We elicited participants' prior beliefs. Specifically, we asked what they believed the impact on the environment and on health is. Immediately after, we asked how sure they were about their answer, to elicit confidence in priors (Dunning et al., 2019). We also asked a similar question about the expected impact of plant-based meat, both to determine whether participants have greater knowledge of that topic relative to lab-grown meat and to test whether lab-grown meat information has spillovers on beliefs about plant-based meat impacts before and after information is provided.
WTP for information and belief updating. After participants reported their priors, we offered them the chance to purchase the findings from one of four randomly chosen science-based reports. The experimental currency in which we elicited WTP consisted of lottery tickets, which also incentivized participation. We initially endowed each participant with 10 such lottery tickets, each with a chance of winning one of 25 $30 e-gift cards. Participants could save their lottery tickets for the lucky draw or use some, or all, of them to learn about the environmental impact of lab-grown beef. Following a Becker-DeGroot-Marschak (BDM) elicitation procedure, we measured each participant’s maximum WTP [0 to 10] to find out the results of the relevant study. We then drew a randomized price for the study. If the price was below the participant’s WTP, we revealed the findings and deducted the price from the participant’s stock of lottery tickets. To ensure that we observed belief updating for most participants, while maintaining incentive compatibility in the BDM procedure, the price was drawn from a distribution with high mass at zero (Hjort et al., 2021). Consequently, we expect a high percentage of participants to receive the information regardless of their WTP. We emphasize this subsample in the belief-updating analysis, since these participants receive the information without selection.
For those who received the information, we subsequently elicited posterior beliefs about the expected impact of lab-grown meat and of plant-based meat (information on the latter was not offered for purchase in this task). Before the WTP question, participants completed a standard attention check designed to assess how attentive they were during the experiment.
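The BDM step above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual implementation: the function names are invented, and the exact shape of the price distribution is not specified in the registration, so the mass-at-zero parameter is an assumption.

```python
import random


def draw_bdm_price(rng, max_price=10, mass_at_zero=0.5):
    """Draw a price in lottery tickets, with extra probability mass at
    zero so that most participants end up receiving the information.
    The 0.5 mass is an illustrative assumption."""
    if rng.random() < mass_at_zero:
        return 0
    return rng.randint(1, max_price)


def run_bdm(wtp, tickets=10, rng=None):
    """One BDM round: reveal the evidence brief iff the drawn price does
    not exceed the stated WTP (ties conventionally go to purchase), and
    deduct that price from the participant's ticket endowment."""
    rng = rng or random.Random()
    price = draw_bdm_price(rng)
    receives_info = price <= wtp
    remaining = tickets - price if receives_info else tickets
    return receives_info, price, remaining
```

Because the price is drawn independently of the bid, reporting one's true WTP remains the dominant strategy; the mass at zero only raises the share of participants who receive the brief, without weakening incentive compatibility.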

Contingent valuation. At the very end, we elicited consumers’ willingness to pay (WTP) for two burger patty alternatives, one made of lab-grown beef, using a payment card. After respondents selected an interval, they were asked to indicate a precise amount of money (Pavlova et al., 2004). This combined elicitation method offered two important advantages. First, it eased respondents’ answers by providing a price tag (Baji et al., 2014) while avoiding the value cues and starting-point biases associated with other methods (closed-ended questions and bidding games). Second, it allowed stated WTP to be measured on a continuous scale (Whynes et al., 2003).
Respondents were asked to choose, from a list of price intervals, the one that best reflected the maximum price they were willing to pay for 1.33 pounds (equivalent to four 1/3-pound patties) of the meat alternative under consideration, and then to state exactly how much they would pay. Respondents were informed that the average price of 1.33 pounds of conventional ground beef in the US market was about eight dollars in August 2024. The payment card itself includes 13 price intervals ranging from $0 to $60, each five dollars wide (B. Chen et al., 2023); a respondent whose WTP is above $60 can choose the interval of $60 or above. To minimize the hypothetical bias of this contingent valuation question, we employed a cheap talk script (Cummings & Taylor, 1999): respondents were instructed to state their WTP as if they were presented with actual products in a real shopping situation.
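The payment-card layout described above can be reconstructed programmatically. The sketch below is an assumption-laden illustration (function names are invented): twelve half-open $5 bins from $0 to $60 plus an open-ended top category give the 13 choices mentioned in the text.

```python
def build_payment_card(top=60, width=5):
    """Build the payment-card intervals: half-open $5 bins from $0 up to
    $60, plus an open-ended '$60 or above' category (13 choices total)."""
    bins = [(lo, lo + width) for lo in range(0, top, width)]
    bins.append((top, None))  # open-ended top interval
    return bins


def choose_interval(card, wtp):
    """Return the interval a respondent with the given WTP would select."""
    for lo, hi in card:
        if hi is None or wtp < hi:
            return (lo, hi)
```

After selecting an interval, the respondent would then state an exact dollar amount within it, which is what yields the continuous WTP measure.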
Exit questions. Participants were asked to rate the reliability of the information provided on a standard 5-point scale (1 = unreliable, 5 = reliable) (Roozenbeek & Van Der Linden, 2019). We also elicited the perceived naturalness of lab-grown meat, a predictor of its acceptance (Siegrist & Sütterlin, 2017; Wilks & Phillips, 2017), along with questions on political affiliation and taste perception.

Randomization Method
The BDM price will be drawn from a distribution with high mass at zero. Participants will receive the evidence brief if the randomly drawn price is below their stated WTP.
Randomization Unit
The experimental currency in which we elicited WTP consisted of lottery tickets, which also incentivized participation. We initially endowed each participant with 100 such lottery tickets, each with a chance of winning one of 20 $2580 e-gift cards. Participants could save their lottery tickets for the lucky draw or use some, or all, of them to learn the estimated effect size of the study. Following a Becker-DeGroot-Marschak (BDM) elicitation procedure, we measured each participant’s maximum WTP [0 to 100] to find out the results of the relevant study. We then drew a randomized price for the study. If the price was below the participant’s WTP, we revealed the findings and deducted the price from the participant’s stock of lottery tickets. To ensure that we observed belief updating for most participants, while maintaining incentive compatibility in the BDM procedure, the price was drawn from a distribution with high mass at zero.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
About 500 US consumers, including those who usually lack access to research (e.g., low literacy), participated in this experiment conducted online.
Sample size (or number of clusters) by treatment arms
Total 500 US Consumers
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
In prior studies, information interventions led to small-to-medium effects (d = 0.2–0.4) on risk perception and trust, particularly when scientific or expert-based information was provided. Sample sizes of roughly 50–200 per group are required to detect medium and small effects, respectively. In a similar study with decision-makers in another context, the sample size was 900 individuals (Hjort et al., 2021).
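The per-group sample sizes quoted above can be checked against the standard normal-approximation formula n ≈ 2((z₁₋α/2 + z₁₋β)/d)². The sketch below (an illustration, not the registration's own power calculation) assumes α = 0.05 and power = 0.80.

```python
import math
from statistics import NormalDist


def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means at standardized effect size d (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value, ~1.96
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)
```

Under these assumptions the formula gives about 99 per group for d = 0.4 and about 393 per group for d = 0.2, so a total sample of 500 is comfortably powered for medium effects and tighter for small ones.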
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials